Posted to dev@hive.apache.org by Shyam Sarkar <sh...@yahoo.com> on 2009/02/03 20:25:32 UTC
Eclipse run fails !!
Hello,
I have the Hive project loaded in Eclipse 3.4.1, and Hadoop 0.19.0 is running in the background. I can create tables from the bin/hive command.
But when I try Run > Run inside Eclipse, it says:
"Errors exist with required project(s):
hive
Proceed with launch ?"
and then it gives many errors.
Can someone please tell me why there are errors in the hive project? I followed all the steps from the Hive wiki correctly.
Regards,
shyam_sarkar@yahoo.com
RE: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Hi,
From my Eclipse test run, I found 18 failures.
========================================================================
TestTruncate
org.apache.hadoop.hive.service.TestHiveServer
warning(junit.framework.TestSuite$1)
junit.framework.AssertionFailedError: Exception in constructor: testExecute (java.lang.NullPointerException
at org.apache.hadoop.hive.service.TestHiveServer.<init>(TestHiveServer.java:38)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at junit.framework.TestSuite.createTest(TestSuite.java:135)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
)
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.TestSuite$1.runTest(TestSuite.java:263)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
warning(junit.framework.TestSuite$1)
junit.framework.AssertionFailedError: Exception in constructor: testMetastore (java.lang.NullPointerException
at org.apache.hadoop.hive.service.TestHiveServer.<init>(TestHiveServer.java:38)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at junit.framework.TestSuite.createTest(TestSuite.java:135)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
)
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.TestSuite$1.runTest(TestSuite.java:263)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
warning(junit.framework.TestSuite$1)
junit.framework.AssertionFailedError: Exception in constructor: testFetch (java.lang.NullPointerException
at org.apache.hadoop.hive.service.TestHiveServer.<init>(TestHiveServer.java:38)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at junit.framework.TestSuite.createTest(TestSuite.java:135)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
)
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.TestSuite$1.runTest(TestSuite.java:263)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
warning(junit.framework.TestSuite$1)
junit.framework.AssertionFailedError: Exception in constructor: testDynamicSerde (java.lang.NullPointerException
at org.apache.hadoop.hive.service.TestHiveServer.<init>(TestHiveServer.java:38)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at junit.framework.TestSuite.createTest(TestSuite.java:135)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
)
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.TestSuite$1.runTest(TestSuite.java:263)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
junit.framework.TestSuite
testCliDriver_implicit_cast1(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_implicit_cast1(TestCliDriver.java:1304)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input16(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16(TestCliDriver.java:1760)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input16_cc(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16_cc(TestCliDriver.java:1798)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input2(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input2(TestCliDriver.java:1988)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input3(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3(TestCliDriver.java:2140)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input3_limit(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3_limit(TestCliDriver.java:2178)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input4(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4(TestCliDriver.java:2216)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input4_cb_delim(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4_cb_delim(TestCliDriver.java:2254)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_input_part5(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part5(TestCliDriver.java:2786)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_inputddl5(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl5(TestCliDriver.java:3166)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_nullinput(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_nullinput(TestCliDriver.java:4458)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
testCliDriver_udf_testlength(org.apache.hadoop.hive.cli.TestCliDriver)
junit.framework.AssertionFailedError: Unexpected exception
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf_testlength(TestCliDriver.java:5408)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
org.apache.hadoop.hive.ql.TestMTQueries
testMTQueries1(org.apache.hadoop.hive.ql.TestMTQueries)
junit.framework.AssertionFailedError: One or more queries failed
at junit.framework.Assert.fail(Assert.java:47)
at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:53)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
org.apache.hadoop.hive.metastore.MetaStoreTestBase
warning(junit.framework.TestSuite$1)
junit.framework.AssertionFailedError: No tests found in org.apache.hadoop.hive.metastore.MetaStoreTestBase
at junit.framework.Assert.fail(Assert.java:47)
at junit.framework.TestSuite$1.runTest(TestSuite.java:263)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
======================================================================
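The "Exception in constructor" warnings above come from JUnit 3's suite builder: it instantiates the test class reflectively once per test method, and any exception thrown by the constructor (here a NullPointerException at TestHiveServer.java:38, typically because some configuration value is unavailable when launched from Eclipse) is caught and reported before the test body ever runs. A minimal sketch of that mechanism, using hypothetical class and method names rather than the actual Hive test code:

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.InvocationTargetException;

// Hypothetical stand-in for TestHiveServer: the constructor dereferences a
// configuration value, so a missing value makes construction itself throw
// NullPointerException -- before any test method has a chance to run.
class FakeHiveServerTest {
    final String host;

    FakeHiveServerTest(String configuredHost) {
        this.host = configuredHost.trim(); // NPE when configuredHost is null
    }
}

public class ConstructorFailureDemo {
    // Instantiate reflectively, the way JUnit 3's TestSuite does, and
    // return the wrapped constructor exception (or null on success).
    static Throwable tryConstruct(String host) throws Exception {
        Constructor<FakeHiveServerTest> ctor =
            FakeHiveServerTest.class.getDeclaredConstructor(String.class);
        try {
            ctor.newInstance(host);
            return null;
        } catch (InvocationTargetException e) {
            // JUnit 3 catches exactly this and reports it as
            // "Exception in constructor: <testName>"
            return e.getCause();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(tryConstruct(null));        // NullPointerException
        System.out.println(tryConstruct("localhost")); // null (success)
    }
}
```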
Thanks,
Shyam
--- On Wed, 2/4/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>, "Raghu Murthy" <ra...@facebook.com>
> Date: Wednesday, February 4, 2009, 2:19 PM
> Yes, we will address them sometime in the future. A lot of them
> have to do with idiosyncrasies of Java generics...
>
> Ashish
>
RE: Eclipse run fails !!
Posted by Ashish Thusoo <at...@facebook.com>.
Yes, we will address them sometime in the future. A lot of them have to do with idiosyncrasies of Java generics...
Ashish
-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Wednesday, February 04, 2009 1:44 PM
To: hive-dev@hadoop.apache.org; Raghu Murthy
Subject: Re: Eclipse run fails !!
Dear Raghu and others,
Please ignore my previous e-mail. When I did a Build All, all the errors were gone. However, I am still getting the following warnings ::
"References to generic type should be parameterized" -- it occurs 1707 times. Should it be corrected in the future?
============================================================
Description Resource Path Location Type
AbstractList is a raw type. References to generic type AbstractList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 361 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 135 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 234 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 306 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 307 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 370 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 13 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 14 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 249 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized thrift_grammar.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 2283 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ColumnInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 56 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 158 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 186 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 193 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 200 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 377 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 389 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 404 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 412 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 74 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeStructBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 50 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBool.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeDouble.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeList.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 40 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 46 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 48 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 51 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeString.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei16.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei32.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei64.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 112 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 113 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 115 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 148 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 149 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 151 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 173 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 209 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 79 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MapOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 93 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 98 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 102 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 106 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 39 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 52 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 58 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 66 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized RandomDimension.java hive/ql/src/java/org/apache/hadoop/hive/ql/metadata line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 30 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 31 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 36 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 41 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 81 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 84 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 119 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 68 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 73 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 35 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 43 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 85 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 122 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 27 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 100 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 124 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 493 Java Problem
===================================================================
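For context, each of those warnings points at a raw use of a generic type. A minimal illustration of the difference — this is a generic sketch, not the actual Hive code:

```java
import java.util.ArrayList;
import java.util.List;

// What the 1707 warnings flag: raw uses of generic types. A raw List
// compiles but loses compile-time element-type checking; the
// parameterized form restores it. Generic sketch, not actual Hive code.
public class RawTypeDemo {

    @SuppressWarnings({"rawtypes", "unchecked"})
    static List buildRaw() {
        List raw = new ArrayList();   // "ArrayList is a raw type" warning here
        raw.add("a string");
        raw.add(42);                  // nothing stops a mixed-type list
        return raw;
    }

    static List<String> buildTyped() {
        List<String> typed = new ArrayList<String>();  // parameterized: no warning
        typed.add("a string");
        // typed.add(42);             // would now fail to compile
        return typed;
    }

    public static void main(String[] args) {
        System.out.println(buildRaw().size() + " " + buildTyped().size());
    }
}
```

The warnings are therefore harmless to correctness today, but parameterizing the declarations would catch type mistakes at compile time instead of at runtime.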
Thanks,
Shyam
--- On Wed, 2/4/09, Shyam Sarkar <sh...@yahoo.com> wrote:
> From: Shyam Sarkar <sh...@yahoo.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "Raghu Murthy" <ra...@facebook.com>
> Date: Wednesday, February 4, 2009, 1:35 PM
> Dear Raghu and others,
>
> I removed the @Override directives and the error count came down to
> three. I am still getting the following errors with override issues ::
>
> Description Resource Path Location Type
> The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver
> must override a superclass method ComparisonOpMethodResolver.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 54 Java Problem
> =============================See the following class==================
>
> public class ComparisonOpMethodResolver implements UDFMethodResolver {
>
>   /**
>    * The udf class for which resolution is needed.
>    */
>   private Class<? extends UDF> udfClass;
>
>   /**
>    * Constructor.
>    */
>   public ComparisonOpMethodResolver(Class<? extends UDF> udfClass) {
>     this.udfClass = udfClass;
>   }
>
>   /* (non-Javadoc)
>    * @see org.apache.hadoop.hive.ql.exec.UDFMethodResolver#getEvalMethod(java.util.List)
>    */
>   /* @Override */
>   public Method getEvalMethod(List<Class<?>> argClasses)
>       throws AmbiguousMethodException {
>     assert(argClasses.size() == 2);
>
>     List<Class<?>> pClasses = null;
>     if (argClasses.get(0) == Void.class || argClasses.get(1) == Void.class) {
>       pClasses = new ArrayList<Class<?>>();
>       pClasses.add(Double.class);
>       pClasses.add(Double.class);
>     }
>     else if (argClasses.get(0) == argClasses.get(1)) {
>       pClasses = argClasses;
>     }
>     else if (argClasses.get(0) == java.sql.Date.class ||
>              argClasses.get(1) == java.sql.Date.class) {
>       pClasses = new ArrayList<Class<?>>();
>       pClasses.add(java.sql.Date.class);
>       pClasses.add(java.sql.Date.class);
>     }
>     else {
>       pClasses = new ArrayList<Class<?>>();
>       pClasses.add(Double.class);
>       pClasses.add(Double.class);
>     }
>
>     Method udfMethod = null;
>
>     for(Method m: Arrays.asList(udfClass.getMethods())) {
>       if (m.getName().equals("evaluate")) {
>
>         Class<?>[] argumentTypeInfos = m.getParameterTypes();
>
>         boolean match = (argumentTypeInfos.length == pClasses.size());
>
>         for(int i=0; i<pClasses.size() && match; i++) {
>           Class<?> accepted =
>               ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
>           if (accepted != pClasses.get(i)) {
>             match = false;
>           }
>         }
>
>         if (match) {
>           if (udfMethod != null) {
>             throw new AmbiguousMethodException(udfClass, argClasses);
>           }
>           else {
>             udfMethod = m;
>           }
>         }
>       }
>     }
>     return udfMethod;
>   }
>
> }
> =============================================================
>
> The other two errors are ::
>
> Description Resource Path Location Type
> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver
> must override a superclass method NumericOpMethodResolver.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 52 Java Problem
> The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver
> must override a superclass method UDFIf.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 81 Java Problem
>
> ======================================================
>
> Thanks,
> Shyam
>
>
>
> --- On Tue, 2/3/09, Raghu Murthy <ra...@facebook.com> wrote:
>
> > From: Raghu Murthy <ra...@facebook.com>
> > Subject: Re: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 5:35 PM
> > Shyam,
> >
> > Please go ahead and remove the @Override directives. This is an issue with
> > eclipse. Versions prior to 3.4.1 actually add @Override to classes which
> > implement interfaces. Version 3.4.1 correctly complains that @Override
> > should not be present for methods from interfaces.
> >
> > Can you submit a jira so that we can fix this?
> >
> > raghu
> >
> >
> > On 2/3/09 5:28 PM, "Shyam Sarkar" <sh...@yahoo.com> wrote:
> >
> > > Dear Prasad,
> > >
> > > I did a clean and then performed a build all for project hive. I am
> > > getting 10 errors and 1706 warnings. All errors are about "must override
> > > a superclass method". It seems to be a compiler problem. I have added
> > > jre1.6.0_11 to the build JRE. Why is the following problem occurring?
> > >
> > > UDFMethodResolver is an interface::
> > >
> > > public interface UDFMethodResolver {
> > >   public Method getEvalMethod(List<Class<?>> argClasses)
> > >       throws AmbiguousMethodException;
> > > }
> > >
> > > The following method should override the above method ::
> > >
> > > public Method getEvalMethod(List<Class<?>> argClasses)
> > >     throws AmbiguousMethodException {
> > >   assert(argClasses.size() == 2);
> > >
> > >   List<Class<?>> pClasses = null;
> > >   if (argClasses.get(0) == Void.class ||
> > >       argClasses.get(1) == Void.class) {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(Double.class);
> > >     pClasses.add(Double.class);
> > >   }
> > >   else if (argClasses.get(0) == argClasses.get(1)) {
> > >     pClasses = argClasses;
> > >   }
> > >   else if (argClasses.get(0) == java.sql.Date.class ||
> > >            argClasses.get(1) == java.sql.Date.class) {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(java.sql.Date.class);
> > >     pClasses.add(java.sql.Date.class);
> > >   }
> > >   else {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(Double.class);
> > >     pClasses.add(Double.class);
> > >   }
> > >
> > >   Method udfMethod = null;
> > >
> > >   for(Method m: Arrays.asList(udfClass.getMethods())) {
> > >     if (m.getName().equals("evaluate")) {
> > >
> > >       Class<?>[] argumentTypeInfos = m.getParameterTypes();
> > >
> > >       boolean match = (argumentTypeInfos.length == pClasses.size());
> > >
> > >       for(int i=0; i<pClasses.size() && match; i++) {
> > >         Class<?> accepted =
> > >             ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
> > >         if (accepted != pClasses.get(i)) {
> > >           match = false;
> > >         }
> > >       }
> > >
> > >       if (match) {
> > >         if (udfMethod != null) {
> > >           throw new AmbiguousMethodException(udfClass, argClasses);
> > >         }
> > >         else {
> > >           udfMethod = m;
> > >         }
> > >       }
> > >     }
> > >   }
> > >   return udfMethod;
> > > }
> > >
> > > }
> > >
> > >
> > >
> > > =====================Errors and Warnings=======================
> > > Description Resource Path Location Type
> > > The method add_partition(Partition) of type MetaStoreClient must override
> > > a superclass method MetaStoreClient.java
> > > hive/metastore/src/java/org/apache/hadoop/hive/metastore line 466 Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver
> > > must override a superclass method ComparisonOpMethodResolver.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 54 Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver
> > > must override a superclass method NumericOpMethodResolver.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 52 Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver
> > > must override a superclass method UDFIf.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 81 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.BoolExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 205 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.ColumnExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 245 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.DefaultExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 584 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.NullExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 94 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.NumExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 121 Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type
> > > TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > > TypeCheckProcFactory.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 163 Java Problem
>
> > line 65 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 70 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 74 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 78 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeStructBase.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 50 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeBase.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 44 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameerized DynamicSerDeTypeBool.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 70 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeDouble.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 67 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeList.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 40 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 46 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 48 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 49 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeSet.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 49 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeSet.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 51 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeString.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 45 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei16.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 37 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei32.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 65 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei64.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 37 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 112 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 113 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 115 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 148 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 149 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 151 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 173 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 209 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized JuteSerDe.java
> > >
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> > line 79 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized JuteSerDe.java
> > >
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> > line 96 Java Problem
> > > Class is a raw type. References to generic type
> > Cass<T> should be
> > > parameterized MapOperator.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 96 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 44 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 93 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 98 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 102
> > > Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 106
> > > Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 39 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 52 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 58 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 66 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized RandomDimension.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/metadata
> > line 32 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 30 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 31 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 36 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 37 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 41 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 81 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 83 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 84 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 119 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 123 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 131 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 139 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > >parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 68 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 73 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 78 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 83 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 35 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 43 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 85 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 122 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 123 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 27 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 32 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 45 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized
> ThriftByteStreamTypedSerDe.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 100 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized
> ThriftByteStreamTypedSerDe.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 124 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ThriftSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized UDFIf.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/udf
> line
> > 67 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized Utilities.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 493 Java Problem
> > >
> > >
> >
> =======================================================================
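[Editor's note: every entry in the dump above is one of the same two Java 5 raw-type lints, and each is silenced by parameterizing the declaration. A minimal sketch of the fix follows; the class, method, and value names here are illustrative, not taken from the Hive sources.]

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypeDemo {

    // Raw type: Eclipse flags this as "ArrayList is a raw type.
    // References to generic type ArrayList<E> should be parameterized".
    @SuppressWarnings({ "rawtypes", "unchecked" })
    static List rawNames() {
        ArrayList names = new ArrayList();
        names.add("srcpart");
        return names;
    }

    // Parameterized equivalent: same runtime behavior, no warning, and
    // the compiler can now reject e.g. names.add(42) at compile time.
    static List<String> typedNames() {
        ArrayList<String> names = new ArrayList<String>();
        names.add("srcpart");
        return names;
    }

    // The Class<T> variant of the warning is fixed the same way:
    //   Class c = String.class;      // raw, warns
    static Class<?> elementType() {   // wildcard-parameterized, clean
        return String.class;
    }

    public static void main(String[] args) {
        System.out.println(typedNames().get(0));
        System.out.println(elementType().getName());
    }
}
```

Note that these raw-type diagnostics are warnings, not errors, so by themselves they do not stop compilation or explain the failed launch.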
> > >
> > > Do you have any suggestions?
> > >
> > > Thanks,
> > > Shyam
> > >
> > >
> > >
> > > --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> > >
> > >> From: Prasad Chakka <pr...@facebook.com>
> > >> Subject: Re: Eclipse run fails !!
> > >> To: "shyam_sarkar@yahoo.com" <sh...@yahoo.com>,
> > >> "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> > >> Date: Tuesday, February 3, 2009, 4:57 PM
> > >> There are compilation errors in the Hive project, which is why
> > >> running the tests is causing issues. Could you send us the
> > >> compilation errors?
> > >> One of the errors should be on the following line. It is most
> > >> probably an Eclipse and Java issue. You can most probably
> > >> remove the @Override annotation and get a successful
> > >> compilation. If there are any more errors, send them to us.
> > >>
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver
> > >> must override a superclass method
> > >>   at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >>
> > >>
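[Editor's note: the @Override failure above is the classic Java 5 vs Java 6 source-level difference. Under a 1.5 source level, @Override is only legal on a method that overrides a concrete superclass method, so putting it on a method that merely implements an interface is a compile error; Java 6 relaxed this. A minimal sketch, with hypothetical names simplified from the error message rather than Hive's real API:]

```java
import java.util.List;

// Hypothetical stand-ins for the interface and class named in the error.
interface UDFMethodResolver {
    String getEvalMethod(List<Class<?>> argClasses);
}

class NumericOpResolver implements UDFMethodResolver {
    // Under a 1.5 source level this annotation is rejected with
    // "must override a superclass method"; under 1.6+ it compiles.
    // Deleting the annotation compiles under both levels.
    @Override
    public String getEvalMethod(List<Class<?>> argClasses) {
        return "evaluate";
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        System.out.println(new NumericOpResolver().getEvalMethod(null));
    }
}
```

The other common way out, instead of deleting the annotations, is raising the project's compiler compliance level to 1.6 in Eclipse's Java Compiler settings.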
> > >> ________________________________
> > >> From: Shyam Sarkar <sh...@yahoo.com>
> > >> Reply-To: <sh...@yahoo.com>
> > >> Date: Tue, 3 Feb 2009 16:51:47 -0800
> > >> To: <hi...@hadoop.apache.org>, Prasad Chakka <pr...@facebook.com>
> > >>
> > >> Dear Prasad,
> > >>
> > >> I followed your instructions with hadoop version 0.17.2.1
> > >> and changed the JRE to version 1.6_11. When I ran the JUnit
> > >> test, I still got the following message:
> > >>
> > >> "Errors exist in required Project(s):
> > >> hive
> > >> Proceed with Launch ?"
> > >>
> > >> When I launched, I got the following errors ::
> > >> =================================== It is long ======================
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1598728140.txt
> > >> Begin query: sample6.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out
> > >>      /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
> > >> Exception: Unresolved compilation problem:
> > >>   The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >>   The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:1044)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1450160017.txt
> > >> Begin query: sample7.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out
> > >>      /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
> > >> Exception: Unresolved compilation problem:
> > >>   The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >>   The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:1070)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_514371634.txt
> > >> Begin query: subq.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
> > >>      /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
> > >> java.lang.Error: Unresolved compilation problem:
> > >>   The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> Exception: Unresolved compilation problem:
> > >>   The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_520907971.txt
Begin query: udf1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
    at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_656206857.txt
Begin query: udf4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
    at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_545867528.txt
Begin query: udf6.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
    at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1947338661.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
    at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Table doesnotexist does not exist
Testing Filter Operator
java.lang.Error: Unresolved compilation problem:
    The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method

    at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
    at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Testing FileSink Operator
FileSink Operator ok
Testing Script Operator
[0] io.o=[1, 01]
[0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[1] io.o=[2, 11]
[1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[2] io.o=[3, 21]
[2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[3] io.o=[4, 31]
[3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[4] io.o=[5, 41]
[4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
Script Operator ok
Testing Map Operator
io1.o.toString() = [[0, 1, 2]]
io2.o.toString() = [[0, 1, 2]]
answer.toString() = [[0, 1, 2]]
io1.o.toString() = [[1, 2, 3]]
io2.o.toString() = [[1, 2, 3]]
answer.toString() = [[1, 2, 3]]
io1.o.toString() = [[2, 3, 4]]
io2.o.toString() = [[2, 3, 4]]
answer.toString() = [[2, 3, 4]]
io1.o.toString() = [[3, 4, 5]]
io2.o.toString() = [[3, 4, 5]]
answer.toString() = [[3, 4, 5]]
io1.o.toString() = [[4, 5, 6]]
io2.o.toString() = [[4, 5, 6]]
answer.toString() = [[4, 5, 6]]
Map Operator ok
JEXL library test ok
Evaluating 1 + 2 for 10000000 times
Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
java.io.FileNotFoundException: join1.q (No such file or directory)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(Unknown Source)
    at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
    at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
    at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
ExprNodeFuncEvaluator ok
ExprNodeColumnEvaluator ok
testExprNodeConversionEvaluator ok
java.lang.Error: Unresolved compilation problem:
    The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method

    at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
    at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
    at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> input struct = [234, [firstString,
> secondString],
> > >> {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol:
> > >>
> >
> org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> > >> TypeName =
> > >>
> >
> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
> > >> ble,nd:double}
> > >> bytes
> > >>
> >
> =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x
> > >>
> >
> 65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79
> > >>
> >
> x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxf
> > >>
> >
> fx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol:
> > >>
> >
> org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> > >> TypeName =
> > >>
> >
> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
> > >> ble,nd:double}
> > >> bytes
> > >>
> >
> =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx
> > >>
> >
> 9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86
> > >>
> >
> xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x0
> > >>
> >
> 0xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol:
> > >> com.facebook.thrift.protocol.TBinaryProtocol
> > >> TypeName =
> > >>
> >
> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
> > >> ble,nd:double}
> > >> bytes
> > >>
> >
> =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x
> > >>
> >
> 74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08
> > >>
> >
> x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x6
> > >>
> >
> 5x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x
> > >>
> 00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
>> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
>> bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
>> bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
>> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
>> bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
>> bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
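The TCTLSeparatedProtocol dump is readable once you know its separators: x01 (^A) between top-level fields, x02 (^B) between list or map entries, and x03 (^C) between a map key and its value. As a hedged sketch, the following rebuilds that row layout (the class name is hypothetical, not part of Hive):

```java
public class CtlSeparatedDemo {
    // Build one row the way the log's hex dump is laid out: ^A (x01)
    // between top-level fields, ^B (x02) between collection items,
    // ^C (x03) between a map key and its value.
    static String row() {
        char FIELD = '\u0001', ITEM = '\u0002', KV = '\u0003';
        return "234"
                + FIELD + "firstString" + ITEM + "secondString"
                + FIELD + "firstKey" + KV + "1" + ITEM + "secondKey" + KV + "2"
                + FIELD + "-234" + FIELD + "1.0" + FIELD + "-2.5";
    }

    public static void main(String[] args) {
        // Stripping the control separators yields the same string as the
        // "bytes in text" line in the log.
        System.out.println(row().replaceAll("[\u0001\u0002\u0003]", ""));
    }
}
```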
>> Beginning Test testTBinarySortableProtocol:
>> Testing struct test { double hello}
>> Testing struct test { i32 hello}
>> Testing struct test { i64 hello}
>> Testing struct test { string hello}
>> Testing struct test { string hello, double another}
>> Test testTBinarySortableProtocol passed!
>> bytes in text =234 firstStringsecondString firstKey1secondKey2>
>> compare to =234 firstStringsecondString firstKey1secondKey2>
>> o class = class java.util.ArrayList
>> o size = 3
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
>> bytes in text =234 firstStringsecondString firstKey1secondKey2>
>> compare to =234 firstStringsecondString firstKey1secondKey2>
>> o class = class java.util.ArrayList
>> o size = 3
>> o = [234, null, {firstKey=1, secondKey=2}]
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
>> OK
>> OK
>> Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
>> Loading data to table testhivedrivertable
>> OK
>> OK
>> OK
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
>> Begin query: altern1.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
>> Done query: altern1.q
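Each client-negative test above follows the same pattern: run the .q file, then compare the produced log against the checked-in golden output with diff -I '\(file:\)\|\(/tmp/.*\)', which ignores differences confined to lines containing volatile file: URIs or /tmp paths. A rough Java sketch of that comparison, with hypothetical class and input names:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

// Sketch of the harness comparison: drop lines matching the volatile
// pattern (file: URIs, /tmp paths), then compare what is left.
public class GoldenDiffDemo {
    static final Pattern VOLATILE = Pattern.compile("(file:)|(/tmp/.*)");

    static List<String> stable(List<String> lines) {
        return lines.stream()
                    .filter(l -> !VOLATILE.matcher(l).find())
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // Hypothetical actual vs. golden logs differing only in a tmp path.
        List<String> actual = Arrays.asList("OK", "file:/tmp/hive_job_log_1.txt", "Done query: altern1.q");
        List<String> golden = Arrays.asList("OK", "file:/tmp/hive_job_log_2.txt", "Done query: altern1.q");
        System.out.println(stable(actual).equals(stable(golden))); // prints true
    }
}
```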
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
>> Begin query: bad_sample_clause.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
>> Done query: bad_sample_clause.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
>> Begin query: clusterbydistributeby.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
>> Done query: clusterbydistributeby.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
>> Begin query: clusterbysortby.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
>> Done query: clusterbysortby.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
>> Begin query: clustern1.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
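The repeated "must override a superclass method" failures are Eclipse compile errors baked into the class files, not Hive runtime bugs: with compiler compliance set to Java 5, Eclipse rejects @Override on a method that implements an interface method (that usage is only legal from Java 6 on), so the class is compiled with an embedded "Unresolved compilation problem" that surfaces as java.lang.Error at run time. A minimal reduction of the pattern, using hypothetical names rather than the real Hive interfaces:

```java
// Hypothetical reduction: an interface-method implementation carrying
// @Override, as TypeCheckProcFactory$StrExprProcessor.process does.
interface NodeProcessor {
    Object process(Object node, Object ctx, Object... args);
}

public class StrExprProcessorDemo implements NodeProcessor {
    @Override // accepted at Java 6 compliance; rejected at Java 5
    public Object process(Object node, Object ctx, Object... args) {
        return node;
    }

    public static void main(String[] args) {
        // Compiles and runs under Java 6+; under 1.5 compliance the
        // @Override line is the "Unresolved compilation problem".
        System.out.println(new StrExprProcessorDemo().process("x", null));
    }
}
```

If that is the cause here, raising the project's compiler compliance to 6.0 (Project Properties > Java Compiler) should clear these failures without any code change.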
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
>> Begin query: clustern2.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
>> Begin query: clustern3.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
>> Begin query: clustern4.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
>> Begin query: describe_xpath1.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
>> Done query: describe_xpath1.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
>> Begin query: describe_xpath2.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
>> Done query: describe_xpath2.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
>> Begin query: describe_xpath3.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
>> Done query: describe_xpath3.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
>> Begin query: describe_xpath4.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
>> Done query: describe_xpath4.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
>> Begin query: fileformat_bad_class.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
>> Done query: fileformat_bad_class.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
>> Begin query: fileformat_void_input.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
> > >> at
> > >>
> > sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> > >> at
> > >>
> > sun.reflect.NativeMethodAccessorImpl.invoke(Unknown
> Source)
> > >> at
> > >>
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> > >> Source)
> > >> at
> java.lang.reflect.Method.invoke(Unknown
> > Source)
> > >> at
> > >>
> > junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at
> > >>
> > junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at
> > >>
> >
> junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at
> > >>
> >
> junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at
> > >>
> > junit.framework.TestResult.run(TestResult.java:109)
> > >> at
> > junit.framework.TestCase.run(TestCase.java:118)
> > >> at
> > >>
> > junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at
> > >>
> junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> > >> stReference.java:130)
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> > )
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> > >> ner.java:460)
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> > >> ner.java:673)
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> > >> ava:386)
> > >> at
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> > >> java:196)
> > >> Hive history
> > >>
> >
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> > >> 893718016.txt
> > >> Begin query: fileformat_void_output.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
> > >> Done query: fileformat_void_output.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
> > >> Begin query: input1.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
> > >> Done query: input1.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
> > >> Begin query: input2.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> Exception: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> > >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> > >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> > >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> 	at junit.framework.TestResult.run(TestResult.java:109)
> > >> 	at junit.framework.TestCase.run(TestCase.java:118)
> > >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
> > >> Begin query: input_testxpath4.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> java.lang.Error: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> Exception: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> > >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> > >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> > >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> 	at junit.framework.TestResult.run(TestResult.java:109)
> > >> 	at junit.framework.TestCase.run(TestCase.java:118)
> > >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
> > >> Begin query: invalid_create_tbl1.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
> > >> Done query: invalid_create_tbl1.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
> > >> Begin query: invalid_create_tbl2.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
> > >> Done query: invalid_create_tbl2.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
> > >> Begin query: invalid_select_expression.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
> > >> Done query: invalid_select_expression.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
> > >> Begin query: invalid_tbl_name.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
> > >> Done query: invalid_tbl_name.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
> > >> Begin query: joinneg.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
> > >> Done query: joinneg.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
> > >> Begin query: load_wrong_fileformat.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
> > >> Done query: load_wrong_fileformat.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
> > >> Begin query: notable_alias3.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> Exception: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> > >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> > >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> > >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> 	at junit.framework.TestResult.run(TestResult.java:109)
> > >> 	at junit.framework.TestCase.run(TestCase.java:118)
> > >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
> > >> Begin query: notable_alias4.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> Exception: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> > >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> > >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> > >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> > >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> > >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> > >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> 	at junit.framework.TestResult.run(TestResult.java:109)
> > >> 	at junit.framework.TestCase.run(TestCase.java:118)
> > >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
> > >> Begin query: strict_pruning.q
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node,
> NodeProcessorCtx,
> > >> Object...) of type
> > TypeCheckProcFactory.NumExprProcessor
> > >> must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation
> problem:
> > >> The method process(Node,
> NodeProcessorCtx,
> > >> Object...) of type
> > TypeCheckProcFactory.NumExprProcessor
> > >> must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> > >> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> > >> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> > >> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
> > >> Begin query: subq_insert.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
> > >> Done query: subq_insert.q
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
> > >> Begin query: union.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
> > >> Done query: union.q
> > >> =====================================================================
> > >>
> > >> Thanks,
> > >> Shyam
> > >>
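[Editorial note: the `diff -I \(file:\)\|\(/tmp/.*\)` invocations in the test output above compare the generated `.q.out` files against the checked-in expected results while ignoring lines that contain machine-specific `file:` URIs or `/tmp/...` paths. As a rough illustration only (the class and method names below are invented for this sketch, not part of Hive, and dropping every matching line is a simplification of diff's `-I` hunk rule), the comparison amounts to:]

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class QOutDiff {
    // Same regex passed to diff -I above: lines mentioning "file:" URIs or
    // /tmp/... paths differ per machine, so they are excluded from comparison.
    private static final Pattern VOLATILE = Pattern.compile("(file:)|(/tmp/.*)");

    static List<String> stripVolatile(List<String> lines) {
        List<String> kept = new ArrayList<String>();
        for (String line : lines) {
            if (!VOLATILE.matcher(line).find()) {
                kept.add(line);
            }
        }
        return kept;
    }

    // Approximation of the harness check: outputs match once the
    // volatile (machine-specific) lines are removed from both sides.
    static boolean sameOutput(List<String> expected, List<String> actual) {
        return stripVolatile(expected).equals(stripVolatile(actual));
    }

    public static void main(String[] args) {
        List<String> expected = Arrays.asList("OK", "Loading data to table src",
                "file:/home/other/warehouse/src");
        List<String> actual = Arrays.asList("OK", "Loading data to table src",
                "file:/home/ssarkar/hive/build/ql/test/data/warehouse/src");
        System.out.println(sameOutput(expected, actual)); // true: only volatile lines differ
    }
}
```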
> > >> --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> > >>
> > >>> From: Prasad Chakka <pr...@facebook.com>
> > >>> Subject: Re: Eclipse run fails !!
> > >>> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> > >>> Date: Tuesday, February 3, 2009, 2:51 PM
> > >>> I think there are multiple issues. Please do the following:
> > >>>
> > >>> 1. 'ant clean' in the hive directory
> > >>> 2. Delete the project in eclipse
> > >>> 3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start an HDFS cluster, since the unit tests work on the local file system
> > >>> 4. Check that the java version is 1.6
> > >>> 5. Follow the steps in the hive eclipse setup wiki with -Dhadoop.version=0.17.2.1
> > >>> 6. Open Eclipse and import the project
> > >>> 7. Open the project preferences and make sure it is using Java 6. If it is not, change it to use Java 6 (let me know if you need help here). If you change it, make sure you rebuild the project by doing a clean
> > >>> 8. Make sure that there are no compilation problems for the hive project (check the 'Problems' tab in the bottom panel of Eclipse)
> > >>> 9. Run the JUnit test case. It should run without any warning dialogs
> > >>>
> > >>> Let me know which of these steps fail and what the error is. You need not change any files to run a junit testcase. Once you are at this point, we can help you set up a command shell that talks to DFS.
> > >>>
> > >>> Prasad
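[Editorial note: steps 4 and 7 above come down to the same check — the JVM that compiled the classes and the JVM that the Eclipse JUnit launcher runs must both be Java 6. A quick illustrative way to see what a given JVM reports is to run a snippet like this under the same JRE your launch configuration uses:]

```java
public class CheckJavaVersion {
    public static void main(String[] args) {
        // e.g. "1.6.0_07" on a Java 6 runtime; a "1.5.x" value here means
        // the launch configuration is still pointed at an older JRE.
        System.out.println("java.version               = "
                + System.getProperty("java.version"));
        System.out.println("java.specification.version = "
                + System.getProperty("java.specification.version"));
        // Which installation the JVM was started from.
        System.out.println("java.home                  = "
                + System.getProperty("java.home"));
    }
}
```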
> > >>>
> > >>> ________________________________
> > >>> From: Ashish Thusoo <at...@facebook.com>
> > >>> Reply-To: <hi...@hadoop.apache.org>
> > >>> Date: Tue, 3 Feb 2009 14:41:12 -0800
> > >>> To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
> > >>> Subject: RE: Eclipse run fails !!
> > >>>
> > >>> Actually, for running hive through eclipse you don't need to download and start hadoop. The Hive tests automatically create a local instance of hdfs and map/reduce and are able to run it.
> > >>>
> > >>> The errors that you are getting seem to indicate some jpox plugins missing in eclipse. Prasad is an expert in that area and can perhaps comment on that...
> > >>>
> > >>> Ashish
> > >>>
> > >>> -----Original Message-----
> > >>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > >>> Sent: Tuesday, February 03, 2009 2:30 PM
> > >>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> > >>> Subject: RE: Eclipse run fails !!
> > >>>
> > >>> Dear Ashish,
> > >>>
> > >>> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error:
> > >>> ==============================================================
> > >>> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
> > >>> starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> > >>> ssarkar@localhost's password:
> > >>> localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> > >>> ssarkar@localhost's password:
> > >>> localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> > >>> localhost: Exception in thread "main" java.lang.NullPointerException
> > >>> localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> > >>> localhost: at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> > >>> localhost: at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> > >>> starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> > >>> ssarkar@localhost's password:
> > >>> localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> > >>> [ssarkar@ayush2 hadoop-0.17.0]$
> > >>> ===================================================================
> > >>>
> > >>> Next I loaded the hive project into eclipse following the steps in the hive wiki.
> > >>> I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error:
> > >>>
> > >>> "Errors exist in required Project(s):
> > >>>
> > >>> hive
> > >>>
> > >>> Proceed with Launch ?"
> > >>>
> > >>> When I launched I got the following errors:
> > >>> =================================================================
> > >>> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> > >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
> > >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
> > >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
> > >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
> > >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
> > >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
> > >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
> > >>> 09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
> > >>> 09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
> > >>> 09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
> > >>> java.lang.UnsupportedClassVersionError: Bad version number in .class file
> > >>> at java.lang.ClassLoader.defineClass1(Native Method)
> > >>> at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> > >>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> > >>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> > >>> at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> > >>> at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> > >>> at java.security.AccessController.doPrivileged(Native Method)
> > >>> at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> > >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > >>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> > >>> at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> > >>> at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> > >>> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
> > >>> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
> > >>> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
> > >>> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
> > >>> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> > >>> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> > >>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
> > >>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
> > >>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
> > >>> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> > >>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
> > >>> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
> > >>> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
> > >>> at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
> > >>> at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
> > >>> at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
> > >>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > >>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > >>> at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> > >>> at junit.framework.TestSuite.createTest(TestSuite.java:131)
> > >>> at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> > >>> at junit.framework.TestSuite.<init>(TestSuite.java:75)
> > >>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> > >>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >>> java.lang.ExceptionInInitializerError
> > >>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > >>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > >>> at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> > >>> at junit.framework.TestSuite.createTest(TestSuite.java:131)
> > >>> at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> > >>> at junit.framework.TestSuite.<init>(TestSuite.java:75)
> > >>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> > >>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >>> Caused by: java.lang.RuntimeException: Encountered throwable
> > >>> at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
> > >>> ... 13 more
> > >>> ======================================================================
> > >>>
> > >>> regards,
> > >>> Shyam
> > >>>
> > >>>
> > >>>
> > >>>
> > >>>
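[Editorial note: the `java.lang.UnsupportedClassVersionError: Bad version number in .class file` in the dump above is an older JVM refusing classes compiled for a newer one — which is why the replies insist on Java 6. Every `.class` file starts with a magic number followed by a version: a JVM rejects any class whose major version (49 for Java 5, 50 for Java 6) is newer than it understands. This sketch, not part of Hive, shows how that header field is read:]

```java
import java.nio.ByteBuffer;

public class ClassFileVersion {
    // Reads the major version from the first 8 bytes of a .class file:
    // u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version.
    // Returns -1 if the bytes do not look like a class-file header.
    static int majorVersion(byte[] header) {
        if (header == null || header.length < 8) return -1;
        ByteBuffer buf = ByteBuffer.wrap(header); // big-endian by default
        if (buf.getInt() != 0xCAFEBABE) return -1;
        buf.getShort();                 // skip minor_version
        return buf.getShort() & 0xFFFF; // major_version: 49 = Java 5, 50 = Java 6
    }

    public static void main(String[] args) {
        // Header of a class compiled with -target 1.6 (major version 50).
        byte[] java6 = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 50};
        System.out.println(majorVersion(java6)); // 50, which a Java 5 JVM rejects
    }
}
```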
> > >>> --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> > >>>
> > >>>> From: Ashish Thusoo <at...@facebook.com>
> > >>>> Subject: RE: Eclipse run fails !!
> > >>>> To: "Shyam Sarkar" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> > >>>> Date: Tuesday, February 3, 2009, 1:46 PM
> > >>>>
> > >>>> Hi Shyam,
> > >>>>
> > >>>> I can certainly say that 0.17.0 should work with eclipse. I have been doing that for a while.
> > >>>>
> > >>>> Maybe we can concentrate on fixing why you are not able to create a table in hdfs. I am not sure why you could not create the /user/hive/warehouse directory in 0.17. Are you saying that
> > >>>>
> > >>>> hadoop dfs -mkdir /user/facebook/hive
> > >>>>
> > >>>> does not work for you? Can you send out the output when you run this command.
> > >>>>
> > >>>> Ashish
> > >>>>
> > >>>> PS: using -Dhadoop.version="0.17.0" for all the commands that are given in the wiki should make things work in eclipse.
> > >>>>
> > >>>> -----Original Message-----
> > >>>> From: Shyam Sarkar
> > >> [mailto:shyam_sarkar@yahoo.com]
> > >>>> Sent: Tuesday, February 03, 2009
> 12:00 PM
> > >>>> To: hive-dev@hadoop.apache.org;
> Ashish
> > Thusoo
> > >>>> Subject: RE: Eclipse run fails !!
> > >>>>
> > >>>> Dear Ashish,
> > >>>>
> > >>>> For the last few days I tried eclipse
> > 3.4.1 with
> > >>> 0.17.2.1 version and
> > >>>> got the same errors with run->run.
> Then
> > I
> > >> looked
> > >>> into bin/hive command
> > >>>> and found that it could not create
> table
> > in HDFS.
> > >> The
> > >>> reason was that
> > >>>> I could not create
> /user/hive/warehouse
> > directory
> > >>> inside HDFS. It was
> > >>>> using Linux FS.
> > >>>> This is why I switched to 0.19.0
> where
> > >> directories in
> > >>> HDFS can be
> > >>>> created.
> > >>>>
> > >>>> Could you please tell me which exact
> > version of
> > >> hadoop
> > >>> will work fine
> > >>>> with eclipse runs ? I want to get
> rid of
> > errors
> > >> in
> > >>> project itself
> > >>>> (before any run).
> > >>>>
> > >>>> Regards,
> > >>>> Shyam
> > >>>>
> > >>>> --- On Tue, 2/3/09, Ashish Thusoo
> > >>>> <at...@facebook.com> wrote:
> > >>>>
> > >>>>> From: Ashish Thusoo
> > >> <at...@facebook.com>
> > >>>>> Subject: RE: Eclipse run fails !!
> > >>>>> To:
> > "hive-dev@hadoop.apache.org"
> > >>>> <hi...@hadoop.apache.org>,
> > >>>>>
> "shyam_sarkar@yahoo.com"
> > >>>> <sh...@yahoo.com>
> > >>>>> Date: Tuesday, February 3, 2009,
> 11:38
> > AM Hi
> > >>> Shyam,
> > >>>>>
> > >>>>> We have not really tried the
> eclipse
> > stuff
> > >> for
> > >>> 0.19.0.
> > >>>> Is it possible
> > >>>>> for you to use 0.17.0 for now,
> while
> > we
> > >> figure
> > >>> this
> > >>>> out...
> > >>>>>
> > >>>>> Ashish
> > >>>>>
> > >>>>> -----Original Message-----
> > >>>>> From: Shyam Sarkar
> > >>> [mailto:shyam_sarkar@yahoo.com]
> > >>>>> Sent: Tuesday, February 03, 2009
> 11:26
> > AM
> > >>>>> To: hive-dev@hadoop.apache.org
> > >>>>> Subject: Eclipse run fails !!
> > >>>>>
> > >>>>> Hello,
> > >>>>>
> > >>>>> I have hive project loaded inside
> > eclipse
> > >> 3.4.1
> > >>> and
> > >>>> hadoop 0.19.0 is
> > >>>>> running in the background. I
> could
> > create
> > >> tables
> > >>> from
> > >>>> bin/hive
> > >>>>> command.
> > >>>>> But when I try to run->run
> inside
> > eclipse
> > >> it
> > >>> says::
> > >>>>>
> > >>>>> "Errors exist with required
> > project(s):
> > >>>>>
> > >>>>> hive
> > >>>>>
> > >>>>> Proceed with launch ?"
> > >>>>>
> > >>>>> and then it gives many errors.
> > >>>>>
> > >>>>> Can someone please tell me why
> there
> > are
> > >> errors
> > >>> in
> > >>>> project hive ? I
> > >>>>> followed all steps correctly from
> hive
> > wiki.
> > >>>>>
> > >>>>> Regards,
> > >>>>> shyam_sarkar@yahoo.com
> > >
> > >
> > >
Re: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Raghu and others,
Please ignore my previous e-mail. When I did Build All, all the errors went away. However, I am still getting the following warnings ::
"Reference to generic type should be parameterized" -- it occurs 1707 times. Should these be corrected in the future ?
============================================================
Description Resource Path Location Type
AbstractList is a raw type. References to generic type AbstractList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 361 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 135 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 234 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 306 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 307 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 370 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 13 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 14 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 249 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized thrift_grammar.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 2283 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ColumnInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 56 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 158 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 186 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 193 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 200 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 377 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 389 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 404 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 412 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 74 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeStructBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 50 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBool.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeDouble.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeList.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 40 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 46 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 48 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 51 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeString.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei16.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei32.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei64.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 112 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 113 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 115 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 148 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 149 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 151 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 173 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 209 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 79 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MapOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 93 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 98 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 102 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 106 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 39 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 52 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 58 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 66 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized RandomDimension.java hive/ql/src/java/org/apache/hadoop/hive/ql/metadata line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 30 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 31 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 36 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 41 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 81 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 84 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 119 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 68 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 73 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 35 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 43 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 85 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 122 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 27 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 100 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 124 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 493 Java Problem
===================================================================
Thanks,
Shyam
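
[Editor's note: the raw-type warnings above are straightforward to silence by adding type parameters. A minimal before/after sketch, using hypothetical names rather than actual Hive source:]

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypeDemo {

    // Before: raw type. Eclipse flags this with "ArrayList is a raw type.
    // References to generic type ArrayList<E> should be parameterized",
    // and element types go unchecked at compile time.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static List rawColumns() {
        ArrayList cols = new ArrayList();
        cols.add("key");
        cols.add("value");
        return cols;
    }

    // After: parameterized type. No warning, and the compiler rejects
    // non-String elements at compile time.
    static List<String> typedColumns() {
        ArrayList<String> cols = new ArrayList<String>();
        cols.add("key");
        cols.add("value");
        return cols;
    }

    public static void main(String[] args) {
        System.out.println(typedColumns()); // prints [key, value]
    }
}
```

Note that some of the flagged files (e.g. thrift_grammar.java, JJTthrift_grammarState.java) are generated, so hand-parameterizing them would presumably be undone the next time the grammar is regenerated.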
--- On Wed, 2/4/09, Shyam Sarkar <sh...@yahoo.com> wrote:
> From: Shyam Sarkar <sh...@yahoo.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "Raghu Murthy" <ra...@facebook.com>
> Date: Wednesday, February 4, 2009, 1:35 PM
> Dear Raghu and others,
>
> I removed @Override directives and error count came down to
> three. I am still getting following error with override
> issues ::
>
> Description Resource Path Location Type
> The method getEvalMethod(List<Class<?>>) of
> type ComparisonOpMethodResolver must override a superclass
> method ComparisonOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 54 Java Problem
> =============================See the following
> class==================
>
> public class ComparisonOpMethodResolver implements
> UDFMethodResolver {
>
> /**
> * The udfclass for which resolution is needed.
> */
> private Class<? extends UDF> udfClass;
>
> /**
> * Constructor.
> */
> public ComparisonOpMethodResolver(Class<? extends
> UDF> udfClass) {
> this.udfClass = udfClass;
> }
>
>
> /* (non-Javadoc)
> * @see
> org.apache.hadoop.hive.ql.exec.UDFMethodResolver#getEvalMethod(java.util.List)
> */
> /* @Override */
> public Method getEvalMethod(List<Class<?>>
> argClasses)
> throws AmbiguousMethodException {
> assert(argClasses.size() == 2);
>
> List<Class<?>> pClasses = null;
> if (argClasses.get(0) == Void.class ||
> argClasses.get(1) == Void.class) {
> pClasses = new ArrayList<Class<?>>();
> pClasses.add(Double.class);
> pClasses.add(Double.class);
> }
> else if (argClasses.get(0) == argClasses.get(1)) {
> pClasses = argClasses;
> }
> else if (argClasses.get(0) == java.sql.Date.class ||
> argClasses.get(1) == java.sql.Date.class) {
> pClasses = new ArrayList<Class<?>>();
> pClasses.add(java.sql.Date.class);
> pClasses.add(java.sql.Date.class);
> }
> else {
> pClasses = new ArrayList<Class<?>>();
> pClasses.add(Double.class);
> pClasses.add(Double.class);
> }
>
> Method udfMethod = null;
>
> for(Method m: Arrays.asList(udfClass.getMethods())) {
> if (m.getName().equals("evaluate")) {
>
> Class<?>[] argumentTypeInfos =
> m.getParameterTypes();
>
> boolean match = (argumentTypeInfos.length ==
> pClasses.size());
>
> for(int i=0; i<pClasses.size() && match;
> i++) {
> Class<?> accepted =
> ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
> if (accepted != pClasses.get(i)) {
> match = false;
> }
> }
>
> if (match) {
> if (udfMethod != null) {
> throw new AmbiguousMethodException(udfClass,
> argClasses);
> }
> else {
> udfMethod = m;
> }
> }
> }
> }
> return udfMethod;
> }
>
> }
> =============================================================
>
> Other two errors are ::
>
> Description Resource Path Location Type
> The method getEvalMethod(List<Class<?>>) of
> type NumericOpMethodResolver must override a superclass
> method NumericOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 52 Java Problem
> The method getEvalMethod(List<Class<?>>) of
> type UDFIf.UDFIfMethodResolver must override a superclass
> method UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line
> 81 Java Problem
>
> ======================================================
>
> Thanks,
> Shyam
>
>
>
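
[Editor's note: the resolver quoted above picks an `evaluate` overload by scanning the UDF class's methods reflectively. A stripped-down sketch of the same technique, with hypothetical class names, not actual Hive code:]

```java
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;

public class ReflectiveResolverDemo {

    // A toy UDF with two evaluate() overloads, standing in for a Hive UDF.
    public static class ToyUdf {
        public String evaluate(Double a, Double b) { return "double"; }
        public String evaluate(String a, String b) { return "string"; }
    }

    // Find the evaluate() method whose parameter types match pClasses
    // exactly -- the same linear scan ComparisonOpMethodResolver performs.
    static Method resolve(Class<?> udfClass, List<Class<?>> pClasses) {
        Method found = null;
        for (Method m : udfClass.getMethods()) {
            if (!m.getName().equals("evaluate")) continue;
            Class<?>[] params = m.getParameterTypes();
            if (params.length != pClasses.size()) continue;
            boolean match = true;
            for (int i = 0; i < params.length; i++) {
                if (params[i] != pClasses.get(i)) { match = false; break; }
            }
            if (match) found = m;
        }
        return found;
    }

    public static void main(String[] args) throws Exception {
        Method m = resolve(ToyUdf.class,
                Arrays.<Class<?>>asList(String.class, String.class));
        System.out.println(m.invoke(new ToyUdf(), "a", "b")); // prints "string"
    }
}
```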
> --- On Tue, 2/3/09, Raghu Murthy <ra...@facebook.com> wrote:
>
> > From: Raghu Murthy <ra...@facebook.com>
> > Subject: Re: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 5:35 PM
> > Shyam,
> >
> > Please go ahead and remove the @Override directives. This is an issue with
> > eclipse. Versions prior to 3.4.1 actually add @Override to classes which
> > implement interfaces. Version 3.4.1 correctly complains that @Override
> > should not be present for methods from interfaces.
> >
> > Can you submit a jira so that we can fix this?
> >
> > raghu
> >
> >
> > On 2/3/09 5:28 PM, "Shyam Sarkar" <sh...@yahoo.com> wrote:
> >
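
[Editor's note: the "must override a superclass method" error discussed above typically depends on the compiler source level: under Java 5 rules, @Override is only legal on methods that override a superclass method, while Java 6 also accepts it on implementations of interface methods. A minimal sketch with hypothetical names:]

```java
// Stand-in for an interface like UDFMethodResolver.
interface Resolver {
    String resolve(String name);
}

class SimpleResolver implements Resolver {
    // Accepted by javac -source 1.6 and later; rejected under -source 1.5
    // with "must override a superclass method" -- the error Eclipse reports
    // when the project's compiler compliance level is set to 5.0.
    @Override
    public String resolve(String name) {
        return "resolved:" + name;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Resolver r = new SimpleResolver();
        System.out.println(r.resolve("hive")); // prints resolved:hive
    }
}
```

Removing the annotation, as suggested above, keeps the source compilable under both levels; raising the project's compiler compliance level to 1.6 is the other way out.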
> > > Dear Prasad,
> > >
> > > I did a clean and then performed build all for project hive. I am getting 10
> > > errors and 1706 warnings. All errors are about "must override a superclass
> > > method". It seems to be a compiler problem. I have added jre1.6.0_11 in build
> > > JRE. Why is the following problem coming ?
> > >
> > > UDFMethodResolver is an interface::
> > >
> > > public interface UDFMethodResolver {
> > >
> > >   public Method getEvalMethod(List<Class<?>> argClasses)
> > >     throws AmbiguousMethodException;
> > > }
> > >
> > > Following method should override above method ::
> > >
> > > public Method getEvalMethod(List<Class<?>> argClasses)
> > >     throws AmbiguousMethodException {
> > >   assert(argClasses.size() == 2);
> > >
> > >   List<Class<?>> pClasses = null;
> > >   if (argClasses.get(0) == Void.class ||
> > >       argClasses.get(1) == Void.class) {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(Double.class);
> > >     pClasses.add(Double.class);
> > >   }
> > >   else if (argClasses.get(0) == argClasses.get(1)) {
> > >     pClasses = argClasses;
> > >   }
> > >   else if (argClasses.get(0) == java.sql.Date.class ||
> > >            argClasses.get(1) == java.sql.Date.class) {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(java.sql.Date.class);
> > >     pClasses.add(java.sql.Date.class);
> > >   }
> > >   else {
> > >     pClasses = new ArrayList<Class<?>>();
> > >     pClasses.add(Double.class);
> > >     pClasses.add(Double.class);
> > >   }
> > >
> > >   Method udfMethod = null;
> > >
> > >   for(Method m: Arrays.asList(udfClass.getMethods())) {
> > >     if (m.getName().equals("evaluate")) {
> > >
> > >       Class<?>[] argumentTypeInfos = m.getParameterTypes();
> > >
> > >       boolean match = (argumentTypeInfos.length == pClasses.size());
> > >
> > >       for(int i=0; i<pClasses.size() && match; i++) {
> > >         Class<?> accepted =
> > >           ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
> > >         if (accepted != pClasses.get(i)) {
> > >           match = false;
> > >         }
> > >       }
> > >
> > >       if (match) {
> > >         if (udfMethod != null) {
> > >           throw new AmbiguousMethodException(udfClass, argClasses);
> > >         }
> > >         else {
> > >           udfMethod = m;
> > >         }
> > >       }
> > >     }
> > >   }
> > >   return udfMethod;
> > > }
> > >
> > > }
> > >
> > > =====================Errors and Warnings=======================
> > > Description  Resource  Path  Location  Type
> > > The method add_partition(Partition) of type MetaStoreClient must override a superclass method  MetaStoreClient.java  hive/metastore/src/java/org/apache/hadoop/hive/metastore  line 466  Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method  ComparisonOpMethodResolver.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 54  Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method  NumericOpMethodResolver.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 52  Java Problem
> > > The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver must override a superclass method  UDFIf.java  hive/ql/src/java/org/apache/hadoop/hive/ql/udf  line 81  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.BoolExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 205  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.ColumnExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 245  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.DefaultExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 584  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NullExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 94  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 121  Java Problem
> > > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 163  Java Problem
> > > AbstractList is a raw type. References to generic type AbstractList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 361  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 131  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 135  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 139  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 143  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 143  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 234  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 306  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 307  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 370  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  JJTthrift_grammarState.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 13  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  JJTthrift_grammarState.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 14  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  OptionsProcessor.java  hive/cli/src/java/org/apache/hadoop/hive/cli  line 76  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  OptionsProcessor.java  hive/cli/src/java/org/apache/hadoop/hive/cli  line 76  Java Problem
> > > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ScriptOperator.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 397  Java Problem
> > > ArrayList is a raw type. References to generic
> type
> > ArrayList<E> should be
> > > parameterized ScriptOperator.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 397 Java Problem
> > > ArrayList is a raw type. References to generic
> type
> > ArrayList<E> should be
> > > parameterized Utilities.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 249 Java Problem
> > > ArrayList is a raw type. References to generic
> type
> > ArrayList<E> should be
> > > parameterized Utilities.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 250 Java Problem
> > > ArrayList is a raw type. References to generic
> type
> > ArrayList<E> should be
> > > parameterized Utilities.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 250 Java Problem
> > > ArrayList is a raw type. References to generic
> type
> > ArrayList<E> should be
> > > parameterized thrift_grammar.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 2283
> > > Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ByteStreamTypedSerDe.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 70 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ColumnInfo.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 56 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 158 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 186 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 193 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 200 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 377 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 389 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 404 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ComplexSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 412 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 65 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 70 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 74 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ConstantTypedSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 78 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeStructBase.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 50 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeBase.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 44 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameerized DynamicSerDeTypeBool.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 70 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeDouble.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 67 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeList.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 40 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 46 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 48 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeMap.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 49 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeSet.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 49 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeSet.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 51 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypeString.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 45 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei16.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 37 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei32.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 65 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized DynamicSerDeTypei64.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> > line 37 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 112 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 113 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized FetchTask.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 115 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 148 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 149 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 151 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 173 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized HiveInputFormat.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/io
> line
> > 209 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized JuteSerDe.java
> > >
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> > line 79 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized JuteSerDe.java
> > >
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> > line 96 Java Problem
> > > Class is a raw type. References to generic type
> > Cass<T> should be
> > > parameterized MapOperator.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 96 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 44 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 93 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 98 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 102
> > > Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized MetadataTypedSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> > line 106
> > > Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 39 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 52 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 58 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized PrimitiveTypeInfo.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> > line 66 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized RandomDimension.java
> > >
> hive/ql/src/java/org/apache/hadoop/hive/ql/metadata
> > line 32 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 30 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 31 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 36 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 37 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 41 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 81 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 83 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 84 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 119 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 123 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 131 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ReflectionSerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 139 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > >parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 68 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 73 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 78 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeField.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 83 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 35 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 43 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 85 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 122 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized SerDeUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 123 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 27 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 32 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized TReflectionUtils.java
> > > hive/serde/src/java/org/apache/hadoop/hive/serde
>
> > line 45 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized
> ThriftByteStreamTypedSerDe.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 100 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized
> ThriftByteStreamTypedSerDe.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 124 Java
> > > Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized ThriftSerDeField.java
> > >
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line
> > 33 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized UDFIf.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/udf
> line
> > 67 Java Problem
> > > Class is a raw type. References to generic type
> > Class<T> should be
> > > parameterized Utilities.java
> > > hive/ql/src/java/org/apache/hadoop/hive/ql/exec
> line
> > 493 Java Problem
> > >
> > >
> >
> =======================================================================
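[Editor's note: nearly all of the warnings above follow one pattern, a collection or Class reference used as a raw type. A minimal sketch of what the warning means and the fix, with illustrative names that are not the actual Hive code:]

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypeExample {
    public static void main(String[] args) {
        // Raw type: compiles, but Eclipse warns ("ArrayList is a raw type")
        // and get() returns Object, so a cast is needed.
        ArrayList raw = new ArrayList();
        raw.add("hive");
        String s1 = (String) raw.get(0);

        // Parameterized type: no warning, no cast, compile-time checking.
        List<String> typed = new ArrayList<String>();
        typed.add("hive");
        String s2 = typed.get(0);

        System.out.println(s1.equals(s2)); // prints "true"
    }
}
```

Note these are warnings, not errors; they do not by themselves stop the Eclipse launch.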
> > >
> > > Do you have any suggestion ?
> > >
> > > Thanks,
> > > Shyam
> > >
> > >
> > >
> > > --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> > >
> > >> From: Prasad Chakka <pr...@facebook.com>
> > >> Subject: Re: Eclipse run fails !!
> > >> To: "shyam_sarkar@yahoo.com" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> > >> Date: Tuesday, February 3, 2009, 4:57 PM
> > >> There are compilation errors in the Hive project,
> > >> which is why running the tests fails. Could you
> > >> send us the compilation errors?
> > >> One of the errors should be on the following line.
> > >> It is most probably an Eclipse and Java issue. You can
> > >> most likely remove the @Override annotation and get a
> > >> successful compilation. If there are any more errors,
> > >> send them to us.
> > >>
> > >> The method getEvalMethod(List<Class<?>>) of type
> > >> NumericOpMethodResolver must override a superclass method
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
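[Editor's note: this "must override a superclass method" error typically comes from the Eclipse compiler's source level rather than from the code itself: under a 1.5 source level, @Override is rejected on methods that implement an interface method (it is accepted from Java 6 onward). A minimal sketch of the pattern, with hypothetical names (MethodResolver, NumericResolver are illustrative, not the actual Hive types):]

```java
import java.util.Collections;
import java.util.List;

interface MethodResolver {
    String getEvalMethod(List<Class<?>> argClasses);
}

class NumericResolver implements MethodResolver {
    // Under a 1.5 source level, an @Override on this method is flagged
    // with "must override a superclass method", because the annotation
    // was not allowed on interface-implementing methods until Java 6.
    // Removing the annotation (as suggested above) or raising the
    // project's compiler compliance to 1.6 both resolve it.
    public String getEvalMethod(List<Class<?>> argClasses) {
        return "evaluate";
    }
}

public class OverrideExample {
    public static void main(String[] args) {
        MethodResolver r = new NumericResolver();
        System.out.println(r.getEvalMethod(Collections.<Class<?>>emptyList())); // prints "evaluate"
    }
}
```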
> > >>
> > >>
> > >> ________________________________
> > >> From: Shyam Sarkar <sh...@yahoo.com>
> > >> Reply-To: <sh...@yahoo.com>
> > >> Date: Tue, 3 Feb 2009 16:51:47 -0800
> > >> To: <hi...@hadoop.apache.org>, Prasad Chakka <pr...@facebook.com>
> > >> Subject: Re: Eclipse run fails !!
> > >>
> > >> Dear Prasad,
> > >>
> > >> I followed your instructions with Hadoop version
> > >> 0.17.2.1 and changed the JRE to version 1.6_11. When I
> > >> ran the JUnit test, I still got the following message:
> > >>
> > >> "Errors exist in required Project(s):
> > >> hive
> > >> Proceed with Launch ?"
> > >>
> > >> When I launched, I got the following errors:
> > >> =================================== It is long ======================
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1598728140.txt
> > >> Begin query: sample6.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:1044)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1450160017.txt
> > >> Begin query: sample7.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:1070)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_514371634.txt
> > >> Begin query: subq.q
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff
> > >>
> >
> /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
> > >>
> >
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
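The recurring "must override a superclass method" error above is how the Eclipse compiler reports an `@Override` annotation on a method that implements an *interface* method while the project's compiler compliance is set to 1.5: Java 5 only allowed `@Override` on methods overriding a class method, while Java 6 also accepts it on interface implementations. A minimal sketch of the pattern (hypothetical `Processor` interface, not Hive's actual `NodeProcessor`):

```java
// Minimal reproduction of the pattern Eclipse rejects at 1.5 compliance:
// @Override on a method that implements an interface method. This compiles
// cleanly under Java 6+ compliance.
interface Processor {
    Object process(Object node, Object ctx, Object... args);
}

class StrProcessor implements Processor {
    @Override  // legal in Java 6+; "must override a superclass method" at 1.5
    public Object process(Object node, Object ctx, Object... args) {
        return "processed:" + node;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Processor p = new StrProcessor();
        System.out.println(p.process("x", null));
    }
}
```

If this is indeed the cause, raising the project's compiler compliance (Project > Properties > Java Compiler > Compiler compliance level > 1.6) should make these particular errors disappear without touching the source.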
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_520907971.txt
> > >> Begin query: udf1.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_656206857.txt
> > >> Begin query: udf4.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_545867528.txt
> > >> Begin query: udf6.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1947338661.txt
> > >> Begin query: union.q
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> > >> OK
> > >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> > >> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> > >> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> > >> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> > >> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Table doesnotexist does not exist
> > >> Testing Filter Operator
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> Testing FileSink Operator
> > >> FileSink Operator ok
> > >> Testing Script Operator
> > >> [0] io.o=[1, 01]
> > >> [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> > >> [1] io.o=[2, 11]
> > >> [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> > >> [2] io.o=[3, 21]
> > >> [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> > >> [3] io.o=[4, 31]
> > >> [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> > >> [4] io.o=[5, 41]
> > >> [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> > >> Script Operator ok
> > >> Testing Map Operator
> > >> io1.o.toString() = [[0, 1, 2]]
> > >> io2.o.toString() = [[0, 1, 2]]
> > >> answer.toString() = [[0, 1, 2]]
> > >> io1.o.toString() = [[1, 2, 3]]
> > >> io2.o.toString() = [[1, 2, 3]]
> > >> answer.toString() = [[1, 2, 3]]
> > >> io1.o.toString() = [[2, 3, 4]]
> > >> io2.o.toString() = [[2, 3, 4]]
> > >> answer.toString() = [[2, 3, 4]]
> > >> io1.o.toString() = [[3, 4, 5]]
> > >> io2.o.toString() = [[3, 4, 5]]
> > >> answer.toString() = [[3, 4, 5]]
> > >> io1.o.toString() = [[4, 5, 6]]
> > >> io2.o.toString() = [[4, 5, 6]]
> > >> answer.toString() = [[4, 5, 6]]
> > >> Map Operator ok
> > >> JEXL library test ok
> > >> Evaluating 1 + 2 for 10000000 times
> > >> Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
> > >> Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
> > >> Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
> > >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
> > >> java.io.FileNotFoundException: join1.q (No such file or directory)
> > >> at java.io.FileInputStream.open(Native Method)
> > >> at java.io.FileInputStream.<init>(Unknown Source)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
> > >> at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
> > >> at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
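The `join1.q` FileNotFoundException above is a different kind of failure from the compilation errors: the test opens the query file by a relative name, so it only works when the working directory matches what the Ant build uses, while an Eclipse launch defaults to the project root. A small sketch of the usual remedy, resolving test files against an explicitly configured base directory instead of the process working directory (the property name `test.data.dir` here is illustrative, not Hive's actual setting):

```java
import java.io.File;

public class TestFileResolver {
    // Resolve a query-file name against an explicit base directory when one
    // is configured, falling back to the plain relative name otherwise.
    static File resolve(String name) {
        String base = System.getProperty("test.data.dir"); // hypothetical property
        return (base == null) ? new File(name) : new File(base, name);
    }

    public static void main(String[] args) {
        System.setProperty("test.data.dir", "/tmp/queries");
        System.out.println(resolve("join1.q").getPath());
    }
}
```

In Eclipse the equivalent quick fix is to set the launch configuration's working directory (Run Configurations > Arguments > Working directory) to the directory the Ant test harness runs from.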
> > >> ExprNodeFuncEvaluator ok
> > >> ExprNodeColumnEvaluator ok
> > >> testExprNodeConversionEvaluator ok
> > >> java.lang.Error: Unresolved compilation problem:
> > >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> > >>
> > >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> > >> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> > >> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> > >> at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> > >> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> > >> at java.lang.reflect.Method.invoke(Unknown Source)
> > >> at junit.framework.TestCase.runTest(TestCase.java:154)
> > >> at junit.framework.TestCase.runBare(TestCase.java:127)
> > >> at junit.framework.TestResult$1.protect(TestResult.java:106)
> > >> at junit.framework.TestResult.runProtected(TestResult.java:124)
> > >> at junit.framework.TestResult.run(TestResult.java:109)
> > >> at junit.framework.TestCase.run(TestCase.java:118)
> > >> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> > >> at junit.framework.TestSuite.run(TestSuite.java:203)
> > >> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> > >> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > >> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >> input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> > >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> > >> bytes = x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> > >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> > >> bytes = xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
> > >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> > >> bytes
> > >>
> >
> =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x
> > >>
> >
> 74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08
> > >>
> >
> x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x6
> > >>
> >
> 5x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x
> > >>
> 00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol:
> > >> com.facebook.thrift.protocol.TJSONProtocol
> > >> TypeName =
> > >>
> >
> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
> > >> ble,nd:double}
> > >> bytes
> > >>
> >
> =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x
> > >>
> >
> 6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67
> > >>
> >
> x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx2
> > >>
> >
> 2x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x
> > >>
> >
> 73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2c
> > >>
> >
> x22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x6
> > >>
> >
> 4x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x
> > >> 7dx7d
> > >> bytes in text
> > >>
> >
> ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{
> > >>
> >
> "map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"
> > >>
> > dbl":1.0},"-6":{"dbl":-2.5}}
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
> > >> Testing protocol:
> > >>
> >
> org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
> > >> TypeName =
> > >>
> >
> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
> > >> ble,nd:double}
> > >> bytes
> > >>
> >
> =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x
> > >>
> >
> 69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32
> > >> x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
> > >> bytes in text
> > >>
> >
> =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
> > >> o class = class java.util.ArrayList
> > >> o size = 6
> > >> o[0] class = class java.lang.Integer
> > >> o[1] class = class java.util.ArrayList
> > >> o[2] class = class java.util.HashMap
> > >> o = [234, [firstString, secondString],
> > {firstKey=1,
> > >> secondKey=2}, -234, 1.0, -2.5]
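The byte dumps above show the idea behind TBinarySortableProtocol: integers are written big-endian with the sign bit flipped, so a plain unsigned byte-by-byte comparison of the serialized forms matches numeric order (note that 234 appears in the dump as x80x00x00xea and -234 as x7fxffxffx16). The following is an illustrative sketch of that encoding only, not Hive's actual protocol code:

```java
public class SortableInt {
    // Encode an int so that unsigned byte-wise comparison of the
    // encodings matches numeric order: flip the sign bit, then write
    // the value big-endian. Sketch only -- not Hive's real code.
    static byte[] encode(int v) {
        int flipped = v ^ 0x80000000;
        return new byte[] {
            (byte) (flipped >>> 24), (byte) (flipped >>> 16),
            (byte) (flipped >>> 8),  (byte) flipped
        };
    }

    // Lexicographic unsigned comparison, as a sort key consumer would do.
    static int compareUnsigned(byte[] a, byte[] b) {
        for (int i = 0; i < 4; i++) {
            int d = (a[i] & 0xff) - (b[i] & 0xff);
            if (d != 0) return d;
        }
        return 0;
    }

    public static void main(String[] args) {
        // -234 must sort before 234, matching the byte dump above.
        System.out.println(compareUnsigned(encode(-234), encode(234)) < 0);
    }
}
```

With this scheme encode(234) yields 80 00 00 ea and encode(-234) yields 7f ff ff 16, exactly the sequences visible in the TBinarySortableProtocol dump.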
Beginning Test testTBinarySortableProtocol:
Testing struct test { double hello}
Testing struct test { i32 hello}
Testing struct test { i64 hello}
Testing struct test { string hello}
Testing struct test { string hello, double another}
Test testTBinarySortableProtocol passed!
bytes in text = 234 firstStringsecondString firstKey1secondKey2>
compare to = 234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
bytes in text = 234 firstStringsecondString firstKey1secondKey2>
compare to = 234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o = [234, null, {firstKey=1, secondKey=2}]
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
OK
OK
Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
Loading data to table testhivedrivertable
OK
OK
OK
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
Begin query: altern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
Done query: altern1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
Begin query: bad_sample_clause.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
Done query: bad_sample_clause.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
Begin query: clusterbydistributeby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
Done query: clusterbydistributeby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
Begin query: clusterbysortby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
Done query: clusterbysortby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
Begin query: clustern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
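The clustern failures all share one root cause, and it is a compiler setting rather than a Hive bug. Eclipse reports "must override a superclass method" for @Override on a method that merely implements an interface method when the project's Java compiler compliance is 5.0; that annotation usage only became legal in Java 6. Eclipse then compiles TypeCheckProcFactory with embedded errors, and the tests hit java.lang.Error at runtime. Raising the compiler compliance level to 1.6 in the project settings typically clears it. A minimal reproduction (hypothetical names, not Hive's actual classes):

```java
// Under Eclipse compliance 1.5 this file fails to compile with
// "The method process(...) must override a superclass method";
// under 1.6+ compliance it compiles cleanly.
interface Processor {
    Object process(Object nd, Object ctx, Object... args);
}

public class OverrideDemo implements Processor {
    @Override  // legal on interface-implementing methods only from Java 6 on
    public Object process(Object nd, Object ctx, Object... args) {
        return "processed";
    }

    public static void main(String[] args) {
        Processor p = new OverrideDemo();
        System.out.println(p.process(null, null));
    }
}
```

If raising the compliance level is not an option, deleting the @Override annotations on interface-implementing methods also makes the code compile under Java 5.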
Begin query: clustern2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
Begin query: clustern3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
Begin query: clustern4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
> > >> Begin query: describe_xpath1.q
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> diff -I
> \(file:\)\|\(/tmp/.*\)
> > >>
> >
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out
> > >>
> >
> /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
> > >> Done query: describe_xpath1.q
> > >> Hive history
> > >>
> >
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> > >> -93672182.txt
Begin query: describe_xpath2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
Done query: describe_xpath2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
Begin query: describe_xpath3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
Done query: describe_xpath3.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
Begin query: describe_xpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
Done query: describe_xpath4.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
Begin query: fileformat_bad_class.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
Done query: fileformat_bad_class.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
Begin query: fileformat_void_input.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
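[Editor's note, an assumption not stated in the log: the "must override a superclass method" errors above are characteristic of compiling `@Override`-annotated interface implementations at a Java 5 source level. Java 5 only allows `@Override` on methods that override a class method; Java 6 also allows it on interface implementations, so setting the Eclipse compiler compliance level to 1.6 typically resolves this. A minimal sketch, with simplified stand-in types (the real Hive `NodeProcessor` interface has a different signature):]

```java
// Simplified stand-ins for Hive's NodeProcessor/StrExprProcessor to show
// the @Override rule; these are NOT the real Hive types.
interface NodeProcessor {
    Object process(Object nd, Object ctx, Object... args);
}

class StrExprProcessor implements NodeProcessor {
    // Legal at source level 1.6+; at 1.5 the Eclipse compiler rejects this
    // with "must override a superclass method", because the method only
    // implements an interface rather than overriding a class method.
    @Override
    public Object process(Object nd, Object ctx, Object... args) {
        return "processed:" + nd;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        NodeProcessor p = new StrExprProcessor();
        System.out.println(p.process("x", null)); // prints "processed:x"
    }
}
```

[With a 1.5 compliance level, Eclipse compiles the annotation as an error and the class throws `java.lang.Error: Unresolved compilation problem` at runtime, which matches the traces above.]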
Begin query: fileformat_void_output.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
Done query: fileformat_void_output.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
Begin query: input1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
Done query: input1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
Begin query: input2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
Begin query: input_testxpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
Begin query: invalid_create_tbl1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
Done query: invalid_create_tbl1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
Begin query: invalid_create_tbl2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
Done query: invalid_create_tbl2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
Begin query: invalid_select_expression.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
Done query: invalid_select_expression.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
Begin query: invalid_tbl_name.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
Done query: invalid_tbl_name.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
Begin query: joinneg.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
Done query: joinneg.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
Begin query: load_wrong_fileformat.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
Done query: load_wrong_fileformat.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
Begin query: notable_alias3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method

    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
> > >> Begin query: notable_alias4.q
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-08,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=11}
> > >> OK
> > >> Loading data to table srcpart partition
> > {ds=2008-04-09,
> > >> hr=12}
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table srcbucket
> > >> OK
> > >> Loading data to table src
> > >> OK
> > >> Exception: Unresolved compilation problem:
> > >> The method process(Node,
> NodeProcessorCtx,
> > >> Object...) of type
> > TypeCheckProcFactory.StrExprProcessor
> > >> must override a superclass method
> > >>
> > >> java.lang.Error: Unresolved compilation
> problem:
> > >> The method process(Node,
> NodeProcessorCtx,
> > >> Object...) of type
> > TypeCheckProcFactory.StrExprProcessor
> > >> must override a superclass method
> > >>
> > >> at
> > >>
> >
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> > >> (TypeCheckProcFactory.java:163)
> > >> at
> > >>
> >
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> > >> tcher.java:80)
> > >> at
> > >>
> >
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> > >> java:83)
> > >> at
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
Begin query: strict_pruning.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method

java.lang.Error: Unresolved compilation problem:
    The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method

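For reference, this particular compiler error typically means the project is being built at Java 5 source compliance: under 1.5, @Override is rejected on a method that implements an interface method, while Java 6 accepts it. A minimal illustration (the interface and class names here are made up for the example, not Hive's actual types):

```java
// Hypothetical reduction of the error: under a 1.5 source level the
// @Override below is rejected with "must override a superclass method";
// at Java 6 source level it compiles and runs fine.
interface Processor {
    Object process(Object node, Object ctx, Object... args);
}

public class OverrideDemo implements Processor {
    @Override // legal in Java 6+; a compile error at 1.5 source compliance
    public Object process(Object node, Object ctx, Object... args) {
        return "processed " + node;
    }

    public static void main(String[] args) {
        System.out.println(new OverrideDemo().process("x", null));
    }
}
```

This is why the project's Eclipse compiler compliance level matters even when the installed JDK is already 1.6.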
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
    at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
    at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at junit.framework.TestCase.runTest(TestCase.java:154)
    at junit.framework.TestCase.runBare(TestCase.java:127)
    at junit.framework.TestResult$1.protect(TestResult.java:106)
    at junit.framework.TestResult.runProtected(TestResult.java:124)
    at junit.framework.TestResult.run(TestResult.java:109)
    at junit.framework.TestCase.run(TestCase.java:118)
    at junit.framework.TestSuite.runTest(TestSuite.java:208)
    at junit.framework.TestSuite.run(TestSuite.java:203)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
Begin query: subq_insert.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
Done query: subq_insert.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
Done query: union.q

=====================================================================

Thanks,
Shyam

--- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:

From: Prasad Chakka <pr...@facebook.com>
Subject: Re: Eclipse run fails !!
To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
Date: Tuesday, February 3, 2009, 2:51 PM
I think there are multiple issues. Please do the following:

1. Run 'ant clean' in the hive directory.
2. Delete the project in Eclipse.
3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start an HDFS cluster, since the unit tests work on the local file system.
4. Check that the Java version is 1.6.
5. Follow the steps in the Hive Eclipse setup wiki with -Dhadoop.version=0.17.2.1.
6. Open Eclipse and import the project.
7. Open the project preferences and make sure the project is using Java 6. If it is not, change it to use Java 6 (let me know if you need help here). If you change it, make sure you rebuild the project by doing a clean.
8. Make sure there are no compilation problems for the hive project (check the 'Problems' tab in the bottom panel of Eclipse).
9. Run the JUnit test case. It should run without any warning dialogs.

Let me know which of these steps fails and what the error is. You need not change any files to run a JUnit test case. Once you are at this point, we can help you set up a command shell that talks to DFS.

Prasad

________________________________
From: Ashish Thusoo <at...@facebook.com>
Reply-To: <hi...@hadoop.apache.org>
Date: Tue, 3 Feb 2009 14:41:12 -0800
To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
Subject: RE: Eclipse run fails !!

Actually, for running Hive through Eclipse you don't need to download and start Hadoop. The Hive tests automatically create a local instance of HDFS and map/reduce and are able to run against it.

The errors you are getting seem to indicate some JPOX plugins are missing in Eclipse. Prasad is an expert in that area and can perhaps comment on that...

Ashish

-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Tuesday, February 03, 2009 2:30 PM
To: hive-dev@hadoop.apache.org; Ashish Thusoo
Subject: RE: Eclipse run fails !!

Dear Ashish,

I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error:

==============================================================
[ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
ssarkar@localhost's password:
localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
ssarkar@localhost's password:
localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
localhost: Exception in thread "main" java.lang.NullPointerException
localhost:     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
ssarkar@localhost's password:
localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
[ssarkar@ayush2 hadoop-0.17.0]$
===================================================================
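As an aside, the SecondaryNameNode NullPointerException in NetUtils.createSocketAddr above usually means fs.default.name is missing or malformed in the Hadoop configuration. If you ever do want that standalone 0.17 cluster to come up (it is not needed for the Hive unit tests, which run against the local file system), an illustrative hadoop-site.xml fragment would look like this (host and port values are placeholders, not taken from this thread):

```xml
<!-- hadoop-site.xml: illustrative placeholder values; pick any free ports.
     The Hive unit tests do NOT need this at all. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```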

Next I loaded the hive project into Eclipse, following the steps in the hive wiki.
I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error:

"Errors exist in required project(s):

hive

Proceed with Launch ?"

When I launch, I get the following errors:

=================================================================
09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
java.lang.UnsupportedClassVersionError: Bad version number in .class file
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
    at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
    at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
    at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
    at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
    at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
    at junit.framework.TestSuite.createTest(TestSuite.java:131)
    at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
    at junit.framework.TestSuite.<init>(TestSuite.java:75)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
    at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > >>> java.lang.ExceptionInInitializerError
> > >>> at
> > >>>
> > >>
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > >>> Method)
> > >>> at
> > >>>
> > >>
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
> > >> orImpl.java:39)
> > >>> at
> > >>>
> > >>
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
> > >> orAccessorImpl.java:27)
> > >>> at
> > >>>
> > >>
> >
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> > >>> at
> > >>>
> > >>
> >
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> > >>> at
> > >>>
> > >>
> >
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> > >>> at
> > >>>
> > >>
> >
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
> > >> estLoader.java:102)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
> > >> 3TestLoader.java:59)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> > >> ner.java:445)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> > >> ner.java:673)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> > >> ava:386)
> > >>> at
> > >>>
> > >>
> >
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> > >> java:196)
> > >>> Caused by: java.lang.RuntimeException:
> > Encountered
> > >>> throwable
> > >>> at
> > >>>
> > >>
> >
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:11
> > >> 3)
> > >>> ... 13 more
> > >>>
> > >>
> >
> ======================================================================
> > >>>
> > >>> regards,
> > >>> Shyam
> > >>>
> > >>>
> > >>>
> > >>>
> > >>>
> > >>> --- On Tue, 2/3/09, Ashish Thusoo
> > >>> <at...@facebook.com> wrote:
> > >>>
> > >>>> From: Ashish Thusoo
> > <at...@facebook.com>
> > >>>> Subject: RE: Eclipse run fails !!
> > >>>> To: "Shyam Sarkar"
> > >>> <sh...@yahoo.com>,
> > >>>>
> "hive-dev@hadoop.apache.org"
> > >>> <hi...@hadoop.apache.org>
> > >>>> Date: Tuesday, February 3, 2009, 1:46
> PM
> > Hi
> > >> Shyam,
> > >>>>
> > >>>> I can certainly say that 0.17.0
> should
> > work with
> > >>> eclipse. I have been
> > >>>> doing that for a while.
> > >>>>
> > >>>> Maybe we can concentrate on fixing
> why you
> > are
> > >> not
> > >>> able to create a
> > >>>> table in hdfs. I am not sure why you
> could
> > not
> > >> create
> > >>> the
> > >>>> /user/hive/warehouse directory in
> 0.17.
> > Are you
> > >> saying
> > >>> that
> > >>>>
> > >>>> hadoop dfs -mkdir /user/facebook/hive
> > >>>>
> > >>>> does not work for you? Can you send
> out
> > the
> > >> output
> > >>> when you run this
> > >>>> command.
> > >>>>
> > >>>> Ashish
> > >>>>
> > >>>> PS: using
> > -Dhadoop.versoion="0.17.0"
> > >> for all
> > >>> the commands that are
> > >>>> given in the wiki should make things
> work
> > in
> > >> eclipse.
> > >>>>
> > >>>> -----Original Message-----
> > >>>> From: Shyam Sarkar
> > >> [mailto:shyam_sarkar@yahoo.com]
> > >>>> Sent: Tuesday, February 03, 2009
> 12:00 PM
> > >>>> To: hive-dev@hadoop.apache.org;
> Ashish
> > Thusoo
> > >>>> Subject: RE: Eclipse run fails !!
> > >>>>
> > >>>> Dear Ashish,
> > >>>>
> > >>>> For the last few days I tried eclipse
> > 3.4.1 with
> > >>> 0.17.2.1 version and
> > >>>> got the same errors with run->run.
> Then
> > I
> > >> looked
> > >>> into bin/hive command
> > >>>> and found that it could not create
> table
> > in HDFS.
> > >> The
> > >>> reason was that
> > >>>> I could not create
> /user/hive/warehouse
> > directory
> > >>> inside HDFS. It was
> > >>>> using Linux FS.
> > >>>> This is why I switched to 0.19.0
> where
> > >> directories in
> > >>> HDFS can be
> > >>>> created.
> > >>>>
> > >>>> Could you please tell me which exact
> > version of
> > >> hadoop
> > >>> will work fine
> > >>>> with eclipse runs ? I want to get
> rid of
> > errors
> > >> in
> > >>> project itself
> > >>>> (before any run).
> > >>>>
> > >>>> Regards,
> > >>>> Shyam
> > >>>>
> > >>>> --- On Tue, 2/3/09, Ashish Thusoo
> > >>>> <at...@facebook.com> wrote:
> > >>>>
> > >>>>> From: Ashish Thusoo
> > >> <at...@facebook.com>
> > >>>>> Subject: RE: Eclipse run fails !!
> > >>>>> To:
> > "hive-dev@hadoop.apache.org"
> > >>>> <hi...@hadoop.apache.org>,
> > >>>>>
> "shyam_sarkar@yahoo.com"
> > >>>> <sh...@yahoo.com>
> > >>>>> Date: Tuesday, February 3, 2009,
> 11:38
> > AM Hi
> > >>> Shyam,
> > >>>>>
> > >>>>> We have not really tried the
> eclipse
> > stuff
> > >> for
> > >>> 0.19.0.
> > >>>> Is it possible
> > >>>>> for you to use 0.17.0 for now,
> while
> > we
> > >> figure
> > >>> this
> > >>>> out...
> > >>>>>
> > >>>>> Ashish
> > >>>>>
> > >>>>> -----Original Message-----
> > >>>>> From: Shyam Sarkar
> > >>> [mailto:shyam_sarkar@yahoo.com]
> > >>>>> Sent: Tuesday, February 03, 2009
> 11:26
> > AM
> > >>>>> To: hive-dev@hadoop.apache.org
> > >>>>> Subject: Eclipse run fails !!
> > >>>>>
> > >>>>> Hello,
> > >>>>>
> > >>>>> I have hive project loaded inside
> > eclipse
> > >> 3.4.1
> > >>> and
> > >>>> hadoop 0.19.0 is
> > >>>>> running in the background. I
> could
> > create
> > >> tables
> > >>> from
> > >>>> bin/hive
> > >>>>> command.
> > >>>>> But when I try to run->run
> inside
> > eclipse
> > >> it
> > >>> says::
> > >>>>>
> > >>>>> "Errors exist with required
> > project(s):
> > >>>>>
> > >>>>> hive
> > >>>>>
> > >>>>> Proceed with launch ?"
> > >>>>>
> > >>>>> and then it gives many errors.
> > >>>>>
> > >>>>> Can someone please tell me why
> there
> > are
> > >> errors
> > >>> in
> > >>>> project hive ? I
> > >>>>> followed all steps correctly from
> hive
> > wiki.
> > >>>>>
> > >>>>> Regards,
> > >>>>> shyam_sarkar@yahoo.com
> > >
> > >
> > >
Re: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Raghu and others,
I removed the @Override directives and the error count came down to three. I am still getting the following error with override issues::
Description Resource Path Location Type
The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method ComparisonOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 54 Java Problem
=============================See the following class==================
public class ComparisonOpMethodResolver implements UDFMethodResolver {

  /**
   * The udf class for which resolution is needed.
   */
  private Class<? extends UDF> udfClass;

  /**
   * Constructor.
   */
  public ComparisonOpMethodResolver(Class<? extends UDF> udfClass) {
    this.udfClass = udfClass;
  }

  /* (non-Javadoc)
   * @see org.apache.hadoop.hive.ql.exec.UDFMethodResolver#getEvalMethod(java.util.List)
   */
  /* @Override */
  public Method getEvalMethod(List<Class<?>> argClasses)
      throws AmbiguousMethodException {
    assert(argClasses.size() == 2);

    List<Class<?>> pClasses = null;
    if (argClasses.get(0) == Void.class ||
        argClasses.get(1) == Void.class) {
      pClasses = new ArrayList<Class<?>>();
      pClasses.add(Double.class);
      pClasses.add(Double.class);
    }
    else if (argClasses.get(0) == argClasses.get(1)) {
      pClasses = argClasses;
    }
    else if (argClasses.get(0) == java.sql.Date.class ||
             argClasses.get(1) == java.sql.Date.class) {
      pClasses = new ArrayList<Class<?>>();
      pClasses.add(java.sql.Date.class);
      pClasses.add(java.sql.Date.class);
    }
    else {
      pClasses = new ArrayList<Class<?>>();
      pClasses.add(Double.class);
      pClasses.add(Double.class);
    }

    Method udfMethod = null;
    for (Method m: Arrays.asList(udfClass.getMethods())) {
      if (m.getName().equals("evaluate")) {
        Class<?>[] argumentTypeInfos = m.getParameterTypes();
        boolean match = (argumentTypeInfos.length == pClasses.size());
        for (int i = 0; i < pClasses.size() && match; i++) {
          Class<?> accepted = ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
          if (accepted != pClasses.get(i)) {
            match = false;
          }
        }
        if (match) {
          if (udfMethod != null) {
            throw new AmbiguousMethodException(udfClass, argClasses);
          }
          else {
            udfMethod = m;
          }
        }
      }
    }
    return udfMethod;
  }
}
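The resolution logic above boils down to: promote the two argument classes to a common pair, then reflect over the UDF's "evaluate" overloads for one whose parameter types match. Here is a minimal standalone sketch of that reflection technique (this is not Hive code; `ResolverSketch`, `MyUdf`, and `resolve` are hypothetical names, and unlike the real resolver it does not do type promotion or throw on ambiguity):

```java
import java.lang.reflect.Method;
import java.util.Arrays;
import java.util.List;

public class ResolverSketch {
    // Hypothetical UDF with two evaluate() overloads.
    public static class MyUdf {
        public Boolean evaluate(Double a, Double b) { return a < b; }
        public Boolean evaluate(String a, String b) { return a.compareTo(b) < 0; }
    }

    // Find the evaluate() overload whose parameter types equal the given
    // argument classes; returns null when nothing matches. The real Hive
    // resolver additionally promotes argument types and throws an
    // AmbiguousMethodException when more than one overload matches.
    public static Method resolve(Class<?> udf, List<Class<?>> args) {
        Method found = null;
        for (Method m : udf.getMethods()) {
            if (!m.getName().equals("evaluate")) continue;
            if (Arrays.asList(m.getParameterTypes()).equals(args)) {
                found = m;
            }
        }
        return found;
    }

    public static void main(String[] args) {
        Method m = resolve(MyUdf.class, Arrays.<Class<?>>asList(Double.class, Double.class));
        System.out.println(m);  // the evaluate(Double, Double) overload
    }
}
```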
=============================================================
The other two errors are::
Description Resource Path Location Type
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method NumericOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 52 Java Problem
The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver must override a superclass method UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 81 Java Problem
======================================================
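For reference, the "must override a superclass method" complaint can be reproduced in miniature. Under a 1.5 source level, @Override is only legal on a method overriding a *class* method; from 1.6 on it is also accepted on interface implementations. The sketch below (hypothetical `Greeter`/`ConsoleGreeter` names, not Hive code) compiles either way because the annotation is left off, which is exactly the workaround applied above:

```java
interface Greeter {
    String greet(String name);
}

class ConsoleGreeter implements Greeter {
    // Under a 1.5 source level (or stricter Eclipse compiler settings),
    // placing @Override here is rejected because greet() comes from an
    // interface, not a superclass; 1.6+ accepts it. Omitting the
    // annotation compiles under both.
    public String greet(String name) {
        return "Hello, " + name;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Greeter g = new ConsoleGreeter();
        System.out.println(g.greet("Shyam"));
    }
}
```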
Thanks,
Shyam
--- On Tue, 2/3/09, Raghu Murthy <ra...@facebook.com> wrote:

> From: Raghu Murthy <ra...@facebook.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 5:35 PM
>
> Shyam,
>
> Please go ahead and remove the @Override directives. This is an issue with eclipse. Versions prior to 3.4.1 actually add @Override to classes which implement interfaces. Version 3.4.1 correctly complains that @Override should not be present for methods from interfaces.
>
> Can you submit a jira so that we can fix this?
>
> raghu
>
>
> On 2/3/09 5:28 PM, "Shyam Sarkar" <sh...@yahoo.com> wrote:
>
> > Dear Prasad,
> >
> > I did a clean and then performed build all for project hive. I am getting 10 errors and 1706 warnings. All errors are about "must override a superclass method". It seems to be a compiler problem. I have added jre1.6.0_11 in the build JRE. Why is the following problem coming?
> >
> > UDFMethodResolver is an interface::
> >
> > public interface UDFMethodResolver {
> >
> >   public Method getEvalMethod(List<Class<?>> argClasses)
> >       throws AmbiguousMethodException;
> > }
> >
> > The following method should override the above method ::
> >
> > public Method getEvalMethod(List<Class<?>> argClasses)
> >     throws AmbiguousMethodException {
> >   assert(argClasses.size() == 2);
> >
> >   List<Class<?>> pClasses = null;
> >   if (argClasses.get(0) == Void.class ||
> >       argClasses.get(1) == Void.class) {
> >     pClasses = new ArrayList<Class<?>>();
> >     pClasses.add(Double.class);
> >     pClasses.add(Double.class);
> >   }
> >   else if (argClasses.get(0) == argClasses.get(1)) {
> >     pClasses = argClasses;
> >   }
> >   else if (argClasses.get(0) == java.sql.Date.class ||
> >            argClasses.get(1) == java.sql.Date.class) {
> >     pClasses = new ArrayList<Class<?>>();
> >     pClasses.add(java.sql.Date.class);
> >     pClasses.add(java.sql.Date.class);
> >   }
> >   else {
> >     pClasses = new ArrayList<Class<?>>();
> >     pClasses.add(Double.class);
> >     pClasses.add(Double.class);
> >   }
> >
> >   Method udfMethod = null;
> >   for (Method m: Arrays.asList(udfClass.getMethods())) {
> >     if (m.getName().equals("evaluate")) {
> >       Class<?>[] argumentTypeInfos = m.getParameterTypes();
> >       boolean match = (argumentTypeInfos.length == pClasses.size());
> >       for (int i = 0; i < pClasses.size() && match; i++) {
> >         Class<?> accepted = ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
> >         if (accepted != pClasses.get(i)) {
> >           match = false;
> >         }
> >       }
> >       if (match) {
> >         if (udfMethod != null) {
> >           throw new AmbiguousMethodException(udfClass, argClasses);
> >         }
> >         else {
> >           udfMethod = m;
> >         }
> >       }
> >     }
> >   }
> >   return udfMethod;
> > }
> >
> > =====================Errors and
> Warnings=======================
> > Description Resource Path Location
> Type
> > The method add_partition(Partition) of type
> MetaStoreClient must override a
> > superclass method MetaStoreClient.java
> >
> hive/metastore/src/java/org/apache/hadoop/hive/metastore
> line 466
> > Java Problem
> > The method getEvalMethod(List<Class<?>>)
> of type ComparisonOpMethodResolver
> > must override a superclass method
> ComparisonOpMethodResolver.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 54 Java Problem
> > The method getEvalMethod(List<Class<?>>)
> of type NumericOpMethodResolver must
> > override a superclass method
> NumericOpMethodResolver.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 52 Java Problem
> > The method getEvalMethod(List<Class<?>>)
> of type UDFIf.UDFIfMethodResolver
> > must override a superclass method UDFIf.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/udf line
> 81 Java Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.BoolExprProcessor must override a
> superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 205 Java
> > Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.ColumnExprProcessor must override
> a superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 245 Java
> > Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.DefaultExprProcessor must
> override a superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 584 Java
> > Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.NullExprProcessor must override a
> superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 94 Java Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.NumExprProcessor must override a
> superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 121 Java
> > Problem
> > The method process(Node, NodeProcessorCtx, Object...)
> of type
> > TypeCheckProcFactory.StrExprProcessor must override a
> superclass method
> > TypeCheckProcFactory.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/parse
> line 163 Java
> > Problem
> > AbstractList is a raw type. References to generic type
> AbstractList<E> should
> > be parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 361 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 131 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 135 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 139 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 143 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 143 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 234 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 306 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 307 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 370 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized JJTthrift_grammarState.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 13 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized JJTthrift_grammarState.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 14 Java
> > Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized OptionsProcessor.java
> > hive/cli/src/java/org/apache/hadoop/hive/cli line
> 76 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized OptionsProcessor.java
> > hive/cli/src/java/org/apache/hadoop/hive/cli line
> 76 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ScriptOperator.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 397 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized ScriptOperator.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 397 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized Utilities.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 249 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized Utilities.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 250 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized Utilities.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 250 Java Problem
> > ArrayList is a raw type. References to generic type
> ArrayList<E> should be
> > parameterized thrift_grammar.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 2283
> > Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ByteStreamTypedSerDe.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 70 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ColumnInfo.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 56 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 158 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 186 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 193 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 200 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 377 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 389 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 404 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ComplexSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 412 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ConstantTypedSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 65 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ConstantTypedSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 70 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ConstantTypedSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 74 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ConstantTypedSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 78 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeStructBase.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 50 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeBase.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 44 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameerized DynamicSerDeTypeBool.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 70 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeDouble.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 67 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeList.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 40 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeMap.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 46 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeMap.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 48 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeMap.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 49 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeSet.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 49 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeSet.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 51 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypeString.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 45 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypei16.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 37 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypei32.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 65 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized DynamicSerDeTypei64.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type
> line 37 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized FetchTask.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 112 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized FetchTask.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 113 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized FetchTask.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 115 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized HiveInputFormat.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/io line
> 148 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized HiveInputFormat.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/io line
> 149 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized HiveInputFormat.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/io line
> 151 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized HiveInputFormat.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/io line
> 173 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized HiveInputFormat.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/io line
> 209 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized JuteSerDe.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> line 79 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized JuteSerDe.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde/jute
> line 96 Java Problem
> > Class is a raw type. References to generic type
> Cass<T> should be
> > parameterized MapOperator.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 96 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized MetadataTypedSerDeField.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> line 44 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized MetadataTypedSerDeField.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> line 93 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized MetadataTypedSerDeField.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> line 98 Java
> > Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized MetadataTypedSerDeField.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> line 102
> > Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized MetadataTypedSerDeField.java
> >
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta
> line 106
> > Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized PrimitiveTypeInfo.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> line 39 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized PrimitiveTypeInfo.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> line 52 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized PrimitiveTypeInfo.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> line 58 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized PrimitiveTypeInfo.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo
> line 66 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized RandomDimension.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/metadata
> line 32 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 30 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 31 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 36 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 37 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 41 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 81 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 83 Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized ReflectionSerDeField.java
> > hive/serde/src/java/org/apache/hadoop/hive/serde
> line 84 Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized (Java Problem) at:
> >   ReflectionSerDeField.java (hive/serde/src/java/org/apache/hadoop/hive/serde) lines 119, 123, 131, 139
> >   SerDeField.java (hive/serde/src/java/org/apache/hadoop/hive/serde) lines 68, 73, 78, 83
> >   SerDeUtils.java (hive/serde/src/java/org/apache/hadoop/hive/serde) lines 33, 33, 35, 43, 85, 122, 123
> >   TReflectionUtils.java (hive/serde/src/java/org/apache/hadoop/hive/serde) lines 27, 32, 45
> >   ThriftByteStreamTypedSerDe.java (hive/serde/src/java/org/apache/hadoop/hive/serde/thrift) lines 100, 124
> >   ThriftSerDeField.java (hive/serde/src/java/org/apache/hadoop/hive/serde/thrift) line 33
> >   UDFIf.java (hive/ql/src/java/org/apache/hadoop/hive/ql/udf) line 67
> >   Utilities.java (hive/ql/src/java/org/apache/hadoop/hive/ql/exec) line 493
> >
> >
> =======================================================================
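All of these are the same Eclipse warning about raw java.lang.Class references. For illustration only (a generic sketch with made-up names, not the actual Hive source), the usual fix is to parameterize the reference, typically with a wildcard:

```java
// Generic sketch of the raw-type warning and its fix; names are made up,
// this is not the actual Hive source.
public class RawTypeDemo {
    // Raw type: Eclipse flags this as "Class is a raw type. References to
    // generic type Class<T> should be parameterized".
    static Class rawClass = String.class;

    // Parameterized with a wildcard: appropriate when the concrete type
    // parameter is unknown at compile time. No warning.
    static Class<?> parameterizedClass = String.class;

    public static void main(String[] args) {
        System.out.println(rawClass.getName());
        System.out.println(parameterizedClass.getName());
    }
}
```

These are only warnings, so they should not by themselves block a launch; Eclipse can also downgrade them under Preferences > Java > Compiler > Errors/Warnings if parameterizing every declaration is not practical.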
> >
> > Do you have any suggestions?
> >
> > Thanks,
> > Shyam
> >
> >
> >
> > --- On Tue, 2/3/09, Prasad Chakka
> <pr...@facebook.com> wrote:
> >
> >> From: Prasad Chakka <pr...@facebook.com>
> >> Subject: Re: Eclipse run fails !!
> >> To: "shyam_sarkar@yahoo.com"
> <sh...@yahoo.com>,
> >> "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>
> >> Date: Tuesday, February 3, 2009, 4:57 PM
> >> There are compilation errors in the Hive project, which is why running the tests fails. Could you send us the compilation errors?
> >> One of the errors should be on the following line. It is most probably an Eclipse/Java issue: you can most likely remove the @Override annotation and get a successful compilation. If there are any more errors, send them to us.
> >>
> >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
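A likely explanation for that error (a guess, assuming the project is built at Java 5 source compliance): before Java 6, @Override was not allowed on a method that implements an interface method, only on one that overrides a class method. A minimal sketch with hypothetical names:

```java
// Hypothetical sketch of the @Override compliance issue; not the Hive source.
// At Java 5 source compliance, the @Override below is a compile error because
// getEvalMethod implements an interface method rather than overriding a class
// method; Java 6 and later accept it. Removing the annotation compiles on both.
interface MethodResolver {
    String getEvalMethod(String opName);
}

class NumericResolver implements MethodResolver {
    @Override  // error at 1.5 compliance, accepted at 1.6+
    public String getEvalMethod(String opName) {
        return "evaluate_" + opName;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        System.out.println(new NumericResolver().getEvalMethod("plus"));
    }
}
```

If removing the annotation is undesirable, raising the project's compiler compliance level to 1.6 (Project > Properties > Java Compiler) should also make it legal.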
> >>
> >>
> >> ________________________________
> >> From: Shyam Sarkar <sh...@yahoo.com>
> >> Reply-To: <sh...@yahoo.com>
> >> Date: Tue, 3 Feb 2009 16:51:47 -0800
> >> To: <hi...@hadoop.apache.org>, Prasad
> Chakka
> >> <pr...@facebook.com>
> >> Subject: Re: Eclipse run fails !!
> >>
> >> Dear Prasad,
> >>
> >> I followed your instructions with Hadoop version 0.17.2.1 and changed the JRE to version 1.6_11. When I ran the JUnit tests, I still got the following message:
> >>
> >> "Errors exist in required Project(s):
> >> hive
> >> Proceed with Launch ?"
> >>
> >> When I launched anyway, I got the following errors (it is long):
> >> ==================================================================
> >> at
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> -1598728140.txt
> >> Begin query: sample6.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
> >> Exception: Unresolved compilation problem:
> >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
> >> pMethodResolver.java:52)
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
> >> .java:274)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
> >> nalyzer.java:2872)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
> >> r.java:2985)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3027)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:10
> >> 44)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 1450160017.txt
> >> Begin query: sample7.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
> >> Exception: Unresolved compilation problem:
> >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
> >> pMethodResolver.java:52)
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
> >> .java:274)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
> >> nalyzer.java:2872)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
> >> r.java:2985)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3027)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:10
> >> 70)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 514371634.txt
> >> Begin query: subq.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
> >> er.java:904)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2712)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3000)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3021)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 520907971.txt
> >> Begin query: udf1.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
> >> er.java:904)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2712)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 656206857.txt
> >> Begin query: udf4.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
> >> er.java:1167)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2724)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >>
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 545867528.txt
> >> Begin query: udf6.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
> >> er.java:1167)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2724)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> -1947338661.txt
> >> Begin query: union.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
> >> er.java:904)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2712)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3000)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3003)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3021)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200>>
> )
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38>>
> )
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
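All of the "must override a superclass method" failures above are the same compile error, which typically means the Eclipse project is building with Java 5 source compliance: under Java 5 rules, @Override is rejected on a method that implements an interface method (it is accepted from Java 6 on). A minimal sketch of the pattern, using hypothetical NodeProcessor/StrProcessor names rather than the real Hive classes:

```java
// Hypothetical interface mirroring the shape of the failing Hive code.
interface NodeProcessor {
    Object process(String node, Object... args);
}

public class StrProcessor implements NodeProcessor {
    // With compiler compliance set to 5.0, Eclipse flags this @Override
    // as "must override a superclass method"; with 6.0 it compiles fine.
    @Override
    public Object process(String node, Object... args) {
        // Return the node name plus the number of extra arguments.
        return node + "/" + args.length;
    }

    public static void main(String[] args) {
        System.out.println(new StrProcessor().process("n", 1, 2)); // prints "n/2"
    }
}
```

If that is the cause, raising the project's compiler compliance level to 6.0 in the Eclipse project settings should make these errors disappear without touching the source.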
> >> Table doesnotexist does not exist
> >> Testing Filter Operator
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
> >>     at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >>     at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Testing FileSink Operator
> >> FileSink Operator ok
> >> Testing Script Operator
> >> [0] io.o=[1, 01]
> >> [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [1] io.o=[2, 11]
> >> [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [2] io.o=[3, 21]
> >> [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [3] io.o=[4, 31]
> >> [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [4] io.o=[5, 41]
> >> [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> Script Operator ok
> >> Testing Map Operator
> >> io1.o.toString() = [[0, 1, 2]]
> >> io2.o.toString() = [[0, 1, 2]]
> >> answer.toString() = [[0, 1, 2]]
> >> io1.o.toString() = [[1, 2, 3]]
> >> io2.o.toString() = [[1, 2, 3]]
> >> answer.toString() = [[1, 2, 3]]
> >> io1.o.toString() = [[2, 3, 4]]
> >> io2.o.toString() = [[2, 3, 4]]
> >> answer.toString() = [[2, 3, 4]]
> >> io1.o.toString() = [[3, 4, 5]]
> >> io2.o.toString() = [[3, 4, 5]]
> >> answer.toString() = [[3, 4, 5]]
> >> io1.o.toString() = [[4, 5, 6]]
> >> io2.o.toString() = [[4, 5, 6]]
> >> answer.toString() = [[4, 5, 6]]
> >> Map Operator ok
> >> JEXL library test ok
> >> Evaluating 1 + 2 for 10000000 times
> >> Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
> >> Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
> >> Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
> >> java.io.FileNotFoundException: join1.q (No such file or directory)
> >>     at java.io.FileInputStream.open(Native Method)
> >>     at java.io.FileInputStream.<init>(Unknown Source)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
> >>     at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> ExprNodeFuncEvaluator ok
> >> ExprNodeColumnEvaluator ok
> >> testExprNodeConversionEvaluator ok
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> >>     at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >>     at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
> >> bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
> >> bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Beginning Test testTBinarySortableProtocol:
> >> Testing struct test { double hello}
> >> Testing struct test { i32 hello}
> >> Testing struct test { i64 hello}
> >> Testing struct test { string hello}
> >> Testing struct test { string hello, double another}
> >> Test testTBinarySortableProtocol passed!
> >> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> >> compare to =234 firstStringsecondString firstKey1secondKey2>
> >> o class = class java.util.ArrayList
> >> o size = 3
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
> >> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> >> compare to =234 firstStringsecondString firstKey1secondKey2>
> >> o class = class java.util.ArrayList
> >> o size = 3
> >> o = [234, null, {firstKey=1, secondKey=2}]
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
> >> OK
> >> OK
> >> Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
> >> Loading data to table testhivedrivertable
> >> OK
> >> OK
> >> OK
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
> >> Begin query: altern1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
> >> Done query: altern1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
> >> Begin query: bad_sample_clause.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
> >> Done query: bad_sample_clause.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
> >> Begin query: clusterbydistributeby.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
> >> Done query: clusterbydistributeby.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
> >> Begin query: clusterbysortby.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
> >> Done query: clusterbysortby.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
> >> Begin query: clustern1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
> >> Begin query: clustern2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
> >> Begin query: clustern3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
> >> Begin query: clustern4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
> >> Begin query: describe_xpath1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
> >> Done query: describe_xpath1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
> >> Begin query: describe_xpath2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
> >> Done query: describe_xpath2.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
> >> Begin query: describe_xpath3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
> >> Done query: describe_xpath3.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
> >> Begin query: describe_xpath4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
> >> Done query: describe_xpath4.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
> >> Begin query: fileformat_bad_class.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
> >> Done query: fileformat_bad_class.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
> >> Begin query: fileformat_void_input.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
> >> Begin query: fileformat_void_output.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
> >> Done query: fileformat_void_output.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
> >> Begin query: input1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
> >> Done query: input1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
> >> Begin query: input2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
> >> Begin query: input_testxpath4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
> >> Begin query: invalid_create_tbl1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
> >> Done query: invalid_create_tbl1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
> >> Begin query: invalid_create_tbl2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
> >> Done query: invalid_create_tbl2.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
> >> Begin query: invalid_select_expression.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
> >> Done query: invalid_select_expression.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
> >> Begin query: invalid_tbl_name.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
> >> Done query: invalid_tbl_name.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
> >> Begin query: joinneg.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
> >> Done query: joinneg.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
> >> Begin query: load_wrong_fileformat.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
> >> Done query: load_wrong_fileformat.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
> >> Begin query: notable_alias3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
> >> Begin query: notable_alias4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
> >> Begin query: strict_pruning.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
> >> Begin query: subq_insert.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
> >> Done query: subq_insert.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
> >> Begin query: union.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
> >> Done query: union.q
> >>
> >> =====================================================================
> >> Thanks,
> >> Shyam
> >>
> >>
> >>
> >>
> >> --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> >>
> >>> From: Prasad Chakka <pr...@facebook.com>
> >>> Subject: Re: Eclipse run fails !!
> >>> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> >>> Date: Tuesday, February 3, 2009, 2:51 PM
> >>> I think there are multiple issues. Please do the following:
> >>>
> >>> 1. 'ant clean' in the hive directory
> >>> 2. Delete the project in eclipse
> >>> 3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start an HDFS cluster, since the unit tests work on the local file system.
> >>> 4. Check that the java version is 1.6
> >>> 5. Follow the steps in the hive eclipse setup wiki with -Dhadoop.version=0.17.2.1
> >>> 6. Open Eclipse and import the project
> >>> 7. Open the project preferences and make sure the project is using java 6. If it is not, change it to use java 6 (let me know if you need help here). If you change it, make sure you rebuild the project by doing a clean.
> >>> 8. Make sure there are no compilation problems for the hive project (check the 'problems' tab in the bottom panel of Eclipse)
> >>> 9. Run the JUnit test case. It should run without any warning dialogs.
> >>>
> >>> Let me know which of these steps fail and what the error is. You need not change any files to run a junit testcase. Once you are at this point, we can help you set up a command shell that talks to DFS.
> >>>
> >>> Prasad
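[Editor's note: Prasad's java 6 steps point at the root cause of the "must override a superclass method" failures in the test log above. Under Java 5 source compliance, @Override is rejected on a method that merely implements an interface method; Java 6 allows it. A minimal sketch of the symptom, using an illustrative stand-in interface rather than Hive's real NodeProcessor signature:]

```java
// Reproduction sketch of the "must override a superclass method" error.
// Compiled with -source 1.5 the @Override below is a compile error;
// with -source 1.6 (as the Hive project settings expect) it is legal.
// NodeProcessor here is a hypothetical stand-in, not Hive's actual type.
interface NodeProcessor {
    Object process(Object node, Object ctx, Object... extra);
}

public class Main implements NodeProcessor {
    @Override // 1.5 compliance: error; 1.6 compliance: fine
    public Object process(Object node, Object ctx, Object... extra) {
        return "processed";
    }

    public static void main(String[] args) {
        // Runs normally once compiled under Java 6 compliance.
        System.out.println(new Main().process(null, null));
    }
}
```

[This is why step 7 (compiler compliance in the project preferences) plus a clean rebuild clears the Unresolved compilation problem errors thrown at runtime by Eclipse's compiled-with-errors classes.]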
> >>>
> >>> ________________________________
> >>> From: Ashish Thusoo <at...@facebook.com>
> >>> Reply-To: <hi...@hadoop.apache.org>
> >>> Date: Tue, 3 Feb 2009 14:41:12 -0800
> >>> To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
> >>> Subject: RE: Eclipse run fails !!
> >>>
> >>> Actually, for running hive through eclipse you don't need to download and start hadoop. The Hive tests automatically create a local instance of hdfs and map/reduce and are able to run against it.
> >>>
> >>> The errors that you are getting seem to indicate some jpox plugins are missing in eclipse. Prasad is an expert in that area and can perhaps comment on that...
> >>>
> >>> Ashish
> >>>
> >>> -----Original Message-----
> >>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>> Sent: Tuesday, February 03, 2009 2:30 PM
> >>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> >>> Subject: RE: Eclipse run fails !!
> >>>
> >>> Dear Ashish,
> >>>
> >>> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error:
> >>>
> >>
> >>> ==============================================================
> >>> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
> >>> starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> >>> localhost: Exception in thread "main" java.lang.NullPointerException
> >>> localhost:     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> >>> localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> >>> localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> >>> starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> >>> [ssarkar@ayush2 hadoop-0.17.0]$
> >>> ===================================================================
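[Editor's note: the NullPointerException from SecondaryNameNode at NetUtils.createSocketAddr in the transcript above is the usual symptom of fs.default.name not resolving to a host:port URI in the Hadoop 0.17 configuration. For reference only, since Prasad's reply above makes clear the HDFS cluster is not needed for the Eclipse unit tests, a pseudo-distributed hadoop-site.xml would sketch out as follows; the localhost ports are assumed example values, not values from this thread:]

```xml
<!-- hadoop-site.xml: minimal pseudo-distributed sketch (assumed values) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```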
> >>>
> >>> Next I loaded the hive project into eclipse following the steps in the hive wiki.
> >>> I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error:
> >>>
> >>> "Errors exist in required project(s):
> >>>
> >>> hive
> >>>
> >>> Proceed with launch?"
> >>>
> >>> When I launched I got the following errors:
> >>>
> >>> =================================================================
> >>> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
> >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
> >>> 09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
> >>> 09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
> >>> 09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
> >>> java.lang.UnsupportedClassVersionError: Bad version number in .class file
> >>>     at java.lang.ClassLoader.defineClass1(Native Method)
> >>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> >>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> >>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> >>>     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> >>>     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
> >>>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> >>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
> >>> at
> >>>
> >>
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:10
> >> 5)
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> >>> Method)
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
> >> orImpl.java:39)
> >>> at
> >>>
> >>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
> >> orAccessorImpl.java:27)
> >>> at
> >>>
> >>
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
> >> estLoader.java:102)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
> >> 3TestLoader.java:59)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:445)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >>> java.lang.ExceptionInInitializerError
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> >>> Method)
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
> >> orImpl.java:39)
> >>> at
> >>>
> >>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
> >> orAccessorImpl.java:27)
> >>> at
> >>>
> >>
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
> >> estLoader.java:102)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
> >> 3TestLoader.java:59)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:445)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >>> Caused by: java.lang.RuntimeException:
> Encountered
> >>> throwable
> >>> at
> >>>
> >>
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:11
> >> 3)
> >>> ... 13 more
> >>>
> >>
> ======================================================================
> >>>
> >>> regards,
> >>> Shyam
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> >>>
> >>>> From: Ashish Thusoo <at...@facebook.com>
> >>>> Subject: RE: Eclipse run fails !!
> >>>> To: "Shyam Sarkar" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> >>>> Date: Tuesday, February 3, 2009, 1:46 PM
> >>>>
> >>>> Hi Shyam,
> >>>>
> >>>> I can certainly say that 0.17.0 should work with eclipse. I have been doing that for a while.
> >>>>
> >>>> Maybe we can concentrate on fixing why you are not able to create a table in hdfs. I am not sure why you could not create the /user/hive/warehouse directory in 0.17. Are you saying that
> >>>>
> >>>> hadoop dfs -mkdir /user/facebook/hive
> >>>>
> >>>> does not work for you? Can you send out the output when you run this command.
> >>>>
> >>>> Ashish
> >>>>
> >>>> PS: using -Dhadoop.version="0.17.0" for all the commands that are given in the wiki should make things work in eclipse.
> >>>>
> >>>> -----Original Message-----
> >>>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>>> Sent: Tuesday, February 03, 2009 12:00 PM
> >>>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> >>>> Subject: RE: Eclipse run fails !!
> >>>>
> >>>> Dear Ashish,
> >>>>
> >>>> For the last few days I tried eclipse 3.4.1 with the 0.17.2.1 version and got the same errors with run->run. Then I looked into the bin/hive command and found that it could not create a table in HDFS. The reason was that I could not create the /user/hive/warehouse directory inside HDFS. It was using the Linux FS. This is why I switched to 0.19.0, where directories in HDFS can be created.
> >>>>
> >>>> Could you please tell me which exact version of hadoop will work fine with eclipse runs? I want to get rid of the errors in the project itself (before any run).
> >>>>
> >>>> Regards,
> >>>> Shyam
> >>>>
> >>>> --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> >>>>
> >>>>> From: Ashish Thusoo <at...@facebook.com>
> >>>>> Subject: RE: Eclipse run fails !!
> >>>>> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> >>>>> Date: Tuesday, February 3, 2009, 11:38 AM
> >>>>>
> >>>>> Hi Shyam,
> >>>>>
> >>>>> We have not really tried the eclipse stuff for 0.19.0. Is it possible for you to use 0.17.0 for now, while we figure this out...
> >>>>>
> >>>>> Ashish
> >>>>>
> >>>>> -----Original Message-----
> >>>>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>>>> Sent: Tuesday, February 03, 2009 11:26 AM
> >>>>> To: hive-dev@hadoop.apache.org
> >>>>> Subject: Eclipse run fails !!
> >>>>>
> >>>>> Hello,
> >>>>>
> >>>>> I have the hive project loaded inside eclipse 3.4.1, and hadoop 0.19.0 is running in the background. I could create tables from the bin/hive command.
> >>>>> But when I try run->run inside eclipse it says::
> >>>>>
> >>>>> "Errors exist with required project(s):
> >>>>>
> >>>>> hive
> >>>>>
> >>>>> Proceed with launch ?"
> >>>>>
> >>>>> and then it gives many errors.
> >>>>>
> >>>>> Can someone please tell me why there are errors in project hive? I followed all steps correctly from the hive wiki.
> >>>>>
> >>>>> Regards,
> >>>>> shyam_sarkar@yahoo.com
> >
> >
> >
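The java.lang.UnsupportedClassVersionError in the trace above ("Bad version number in .class file") means the JVM running the tests is older than the javac that compiled the classes. One way to diagnose it is to read the class-file header, whose bytes 6-7 carry the major version (49 = Java 5, 50 = Java 6). This is an illustrative sketch, not part of Hive; the class name and sample bytes are made up:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ClassVersionCheck {
    /**
     * Returns the major class-file version (e.g. 49 = Java 5, 50 = Java 6),
     * or -1 if the stream does not start with the 0xCAFEBABE magic number.
     */
    static int majorVersion(DataInputStream in) throws IOException {
        if (in.readInt() != 0xCAFEBABE) {
            return -1;                  // not a class file
        }
        in.readUnsignedShort();         // skip minor version (bytes 4-5)
        return in.readUnsignedShort();  // major version (bytes 6-7)
    }

    public static void main(String[] args) throws IOException {
        // Fake header: magic 0xCAFEBABE, minor 0, major 50 (Java 6).
        byte[] header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 50};
        int major = majorVersion(new DataInputStream(new ByteArrayInputStream(header)));
        System.out.println("major = " + major); // prints "major = 50"
    }
}
```

Pointing the same logic at a `.class` file from the build output (via a `FileInputStream`) tells you which compiler produced it, which you can compare against `java -version` of the runtime Eclipse launches.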
Re: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
I added a new entry in Jira for @Override directive eliminations.
Thanks.
Shyam
--- On Tue, 2/3/09, Raghu Murthy <ra...@facebook.com> wrote:
> From: Raghu Murthy <ra...@facebook.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 5:35 PM
> Shyam,
>
> Please go ahead and remove the @Override directives. This is an issue with
> eclipse. Versions prior to 3.4.1 actually add @Override to classes which
> implement interfaces. Version 3.4.1 correctly complains that @Override
> should not be present for methods from interfaces.
>
> Can you submit a jira so that we can fix this?
>
> raghu
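The behaviour Raghu describes can be reproduced with a toy pair of types (the names below are invented for illustration, not taken from Hive). Under a 5.0 compiler compliance level, annotating an interface-implementing method with @Override is rejected with "must override a superclass method"; a 6.0-level compiler accepts it. Deleting the annotation compiles under both:

```java
import java.util.Arrays;
import java.util.List;

interface Resolver {
    String resolve(List<String> args);
}

class SimpleResolver implements Resolver {
    // With -source 1.5 (or Eclipse compliance 5.0) this annotation is a
    // compile error, because resolve() comes from an interface, not a
    // superclass. With -source 1.6 it is accepted. Removing it compiles
    // under both levels, which is why the thread suggests deleting them.
    @Override
    public String resolve(List<String> args) {
        return args.isEmpty() ? "" : args.get(0);
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Resolver r = new SimpleResolver();
        System.out.println(r.resolve(Arrays.asList("a", "b"))); // prints "a"
    }
}
```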
>
>
> On 2/3/09 5:28 PM, "Shyam Sarkar" <sh...@yahoo.com> wrote:
>
> > Dear Prasad,
> >
> > I did a clean and then performed a build-all for project hive. I am getting 10
> > errors and 1706 warnings. All errors are about "must override a superclass
> > method". It seems to be a compiler problem. I have added jre1.6.0_11 as the build
> > JRE. Why is the following problem coming?
> >
> > UDFMethodResolver is an interface::
> >
> >   public interface UDFMethodResolver {
> >     public Method getEvalMethod(List<Class<?>> argClasses)
> >         throws AmbiguousMethodException;
> >   }
> >
> > Following method should override above method ::
> >
> >   public Method getEvalMethod(List<Class<?>> argClasses)
> >       throws AmbiguousMethodException {
> >     assert(argClasses.size() == 2);
> >
> >     List<Class<?>> pClasses = null;
> >     if (argClasses.get(0) == Void.class ||
> >         argClasses.get(1) == Void.class) {
> >       pClasses = new ArrayList<Class<?>>();
> >       pClasses.add(Double.class);
> >       pClasses.add(Double.class);
> >     }
> >     else if (argClasses.get(0) == argClasses.get(1)) {
> >       pClasses = argClasses;
> >     }
> >     else if (argClasses.get(0) == java.sql.Date.class ||
> >              argClasses.get(1) == java.sql.Date.class) {
> >       pClasses = new ArrayList<Class<?>>();
> >       pClasses.add(java.sql.Date.class);
> >       pClasses.add(java.sql.Date.class);
> >     }
> >     else {
> >       pClasses = new ArrayList<Class<?>>();
> >       pClasses.add(Double.class);
> >       pClasses.add(Double.class);
> >     }
> >
> >     Method udfMethod = null;
> >
> >     for (Method m: Arrays.asList(udfClass.getMethods())) {
> >       if (m.getName().equals("evaluate")) {
> >
> >         Class<?>[] argumentTypeInfos = m.getParameterTypes();
> >
> >         boolean match = (argumentTypeInfos.length == pClasses.size());
> >
> >         for (int i=0; i<pClasses.size() && match; i++) {
> >           Class<?> accepted =
> >               ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
> >           if (accepted != pClasses.get(i)) {
> >             match = false;
> >           }
> >         }
> >
> >         if (match) {
> >           if (udfMethod != null) {
> >             throw new AmbiguousMethodException(udfClass, argClasses);
> >           }
> >           else {
> >             udfMethod = m;
> >           }
> >         }
> >       }
> >     }
> >     return udfMethod;
> >   }
> > }
> >
> >
> >
> > =====================Errors and Warnings=======================
> > Description  Resource  Path  Location  Type
> > The method add_partition(Partition) of type MetaStoreClient must override a superclass method  MetaStoreClient.java  hive/metastore/src/java/org/apache/hadoop/hive/metastore  line 466  Java Problem
> > The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method  ComparisonOpMethodResolver.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 54  Java Problem
> > The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method  NumericOpMethodResolver.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 52  Java Problem
> > The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver must override a superclass method  UDFIf.java  hive/ql/src/java/org/apache/hadoop/hive/ql/udf  line 81  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.BoolExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 205  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.ColumnExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 245  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.DefaultExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 584  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NullExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 94  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 121  Java Problem
> > The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method  TypeCheckProcFactory.java  hive/ql/src/java/org/apache/hadoop/hive/ql/parse  line 163  Java Problem
> > AbstractList is a raw type. References to generic type AbstractList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 361  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 131  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 135  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 139  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 143  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 143  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 234  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 306  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 307  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 370  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  JJTthrift_grammarState.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 13  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  JJTthrift_grammarState.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 14  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  OptionsProcessor.java  hive/cli/src/java/org/apache/hadoop/hive/cli  line 76  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  OptionsProcessor.java  hive/cli/src/java/org/apache/hadoop/hive/cli  line 76  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ScriptOperator.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 397  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  ScriptOperator.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 397  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  Utilities.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 249  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  Utilities.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 250  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  Utilities.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 250  Java Problem
> > ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized  thrift_grammar.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 2283  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ByteStreamTypedSerDe.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 70  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ColumnInfo.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 56  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 158  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 186  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 193  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 200  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 377  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 389  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 404  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ComplexSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 412  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ConstantTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 65  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ConstantTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 70  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ConstantTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 74  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ConstantTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 78  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeStructBase.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 50  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeBase.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 44  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeBool.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 70  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeDouble.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 67  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeList.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 40  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeMap.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 46  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeMap.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 48  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeMap.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 49  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeSet.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 49  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeSet.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 51  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypeString.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 45  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypei16.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 37  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypei32.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 65  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  DynamicSerDeTypei64.java  hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type  line 37  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  FetchTask.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 112  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  FetchTask.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 113  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  FetchTask.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 115  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  HiveInputFormat.java  hive/ql/src/java/org/apache/hadoop/hive/ql/io  line 148  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  HiveInputFormat.java  hive/ql/src/java/org/apache/hadoop/hive/ql/io  line 149  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  HiveInputFormat.java  hive/ql/src/java/org/apache/hadoop/hive/ql/io  line 151  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  HiveInputFormat.java  hive/ql/src/java/org/apache/hadoop/hive/ql/io  line 173  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  HiveInputFormat.java  hive/ql/src/java/org/apache/hadoop/hive/ql/io  line 209  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  JuteSerDe.java  hive/serde/src/java/org/apache/hadoop/hive/serde/jute  line 79  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  JuteSerDe.java  hive/serde/src/java/org/apache/hadoop/hive/serde/jute  line 96  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MapOperator.java  hive/ql/src/java/org/apache/hadoop/hive/ql/exec  line 96  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MetadataTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta  line 44  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MetadataTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta  line 93  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MetadataTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta  line 98  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MetadataTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta  line 102  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  MetadataTypedSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta  line 106  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  PrimitiveTypeInfo.java  hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo  line 39  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  PrimitiveTypeInfo.java  hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo  line 52  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  PrimitiveTypeInfo.java  hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo  line 58  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  PrimitiveTypeInfo.java  hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo  line 66  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  RandomDimension.java  hive/ql/src/java/org/apache/hadoop/hive/ql/metadata  line 32  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 30  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 31  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 36  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 37  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 41  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 81  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 83  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 84  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 119  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 123  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 131  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ReflectionSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 139  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 68  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 73  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 78  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 83  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 33  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 33  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 35  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 43  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 85  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 122  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  SerDeUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 123  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  TReflectionUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 27  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  TReflectionUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 32  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  TReflectionUtils.java  hive/serde/src/java/org/apache/hadoop/hive/serde  line 45  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ThriftByteStreamTypedSerDe.java  hive/serde/src/java/org/apache/hadoop/hive/serde/thrift  line 100  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ThriftByteStreamTypedSerDe.java  hive/serde/src/java/org/apache/hadoop/hive/serde/thrift  line 124  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  ThriftSerDeField.java  hive/serde/src/java/org/apache/hadoop/hive/serde/thrift  line 33  Java Problem
> > Class is a raw type. References to generic type Class<T> should be parameterized  UDFIf.java  hive/ql/src/java/org/apache/hadoop/hive/ql/udf  line 67  Java Problem
> > Class is a raw type. References to generic type
> Class<T> should be
> > parameterized Utilities.java
> > hive/ql/src/java/org/apache/hadoop/hive/ql/exec line
> 493 Java Problem
> >
> >
> =======================================================================
> >
> > Do you have any suggestions?
> >
> > Thanks,
> > Shyam
> >
> >
> >
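The warnings listed above are all the same generics-hygiene issue: a `Class` reference used without a type argument. A minimal sketch of the parameterized form the compiler is asking for (the class and method names here are illustrative, not Hive's actual code):

```java
// RawTypeDemo.java -- illustrating the "Class is a raw type" warning and its fix.
public class RawTypeDemo {

    // Raw form that Eclipse flags with the warning above:
    //   static String describeRaw(Class c) { return c.getName(); }

    // Parameterized form: Class<?> accepts any class literal, no warning.
    static String describe(Class<?> c) {
        return c.getName();
    }

    public static void main(String[] args) {
        // Class literals are already typed, so they fit Class<?> directly.
        System.out.println(describe(String.class));   // java.lang.String
        System.out.println(describe(int[].class));    // [I
    }
}
```

Note these are warnings, not errors, so they are not what blocks the Eclipse launch; they just clutter the Problems view.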
> > --- On Tue, 2/3/09, Prasad Chakka
> <pr...@facebook.com> wrote:
> >
> >> From: Prasad Chakka <pr...@facebook.com>
> >> Subject: Re: Eclipse run fails !!
> >> To: "shyam_sarkar@yahoo.com"
> <sh...@yahoo.com>,
> >> "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>
> >> Date: Tuesday, February 3, 2009, 4:57 PM
> >> There are compilation errors in the Hive project, which is
> >> why running the tests fails. Could you send us the
> >> compilation errors?
> >> One of the errors should be on the following line. It is
> >> most probably an Eclipse and Java issue. You can most
> >> likely remove the @Override annotation and get a successful
> >> compilation. If there are any more errors, send them to us.
> >>
> >> The method
> getEvalMethod(List<Class<?>>) of
> >> type NumericOpMethodResolver must override a
> superclass
> >> method
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
> >> pMethodResolver.java:52)
> >>
> >>
> >> ________________________________
> >> From: Shyam Sarkar <sh...@yahoo.com>
> >> Reply-To: <sh...@yahoo.com>
> >> Date: Tue, 3 Feb 2009 16:51:47 -0800
> >> To: <hi...@hadoop.apache.org>, Prasad
> Chakka
> >> <pr...@facebook.com>
> >> Subject: Re: Eclipse run fails !!
> >>
> >> Dear Prasad,
> >>
> >> I followed your instructions with Hadoop version 0.17.2.1
> >> and changed the JRE to version 1.6_11. When I ran the
> >> JUnit test, I still got the following message:
> >>
> >> "Errors exist in required Project(s):
> >> hive
> >> Proceed with Launch ?"
> >>
> >> When I launched, I got the following errors:
> >> =================================== It is long
> >> ======================
> >> at
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> -1598728140.txt
> >> Begin query: sample6.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
> >> Exception: Unresolved compilation problem:
> >> The method
> >> getEvalMethod(List<Class<?>>) of type
> >> NumericOpMethodResolver must override a superclass
> method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method
> >> getEvalMethod(List<Class<?>>) of type
> >> NumericOpMethodResolver must override a superclass
> method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
> >> pMethodResolver.java:52)
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
> >> .java:274)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
> >> nalyzer.java:2872)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
> >> r.java:2985)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3027)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:10
> >> 44)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 1450160017.txt
> >> Begin query: sample7.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
> >> Exception: Unresolved compilation problem:
> >> The method
> >> getEvalMethod(List<Class<?>>) of type
> >> NumericOpMethodResolver must override a superclass
> method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method
> >> getEvalMethod(List<Class<?>>) of type
> >> NumericOpMethodResolver must override a superclass
> method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
> >> pMethodResolver.java:52)
> >> at
> >>
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
> >> .java:274)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
> >> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
> >> nalyzer.java:2872)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
> >> r.java:2985)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3027)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:10
> >> 70)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 514371634.txt
> >> Begin query: subq.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
> >> er.java:904)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2712)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3000)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3021)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 520907971.txt
> >> Begin query: udf1.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
> >> er.java:904)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2712)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 656206857.txt
> >> Begin query: udf4.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
> >> er.java:1167)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2724)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >>
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> 545867528.txt
> >> Begin query: udf6.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
> >> (TypeCheckProcFactory.java:163)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
> >> tcher.java:80)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
> >> java:83)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
> >> :113)
> >> at
> >>
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
> >> ker.java:95)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
> >> yzer.java:3311)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
> >> er.java:1167)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
> >> .java:2724)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
> >> a:3048)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
> >> yzer.java:3229)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
> >> yzer.java:71)
> >> at
> >>
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >> at
> >>
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> at
> >>
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> at
> >>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> >> Source)
> >> at java.lang.reflect.Method.invoke(Unknown
> Source)
> >> at
> >>
> junit.framework.TestCase.runTest(TestCase.java:154)
> >> at
> >>
> junit.framework.TestCase.runBare(TestCase.java:127)
> >> at
> >>
> junit.framework.TestResult$1.protect(TestResult.java:106)
> >> at
> >>
> junit.framework.TestResult.runProtected(TestResult.java:124)
> >> at
> >>
> junit.framework.TestResult.run(TestResult.java:109)
> >> at
> junit.framework.TestCase.run(TestCase.java:118)
> >> at
> >>
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> at
> >> junit.framework.TestSuite.run(TestSuite.java:203)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
> >> stReference.java:130)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history
> >>
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
> >> -1947338661.txt
> >> Begin query: union.q
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-08,
> >> hr=12}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=11}
> >> OK
> >> Loading data to table srcpart partition
> {ds=2008-04-09,
> >> hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff
> >>
> /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out
> >>
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> >>     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
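[Editor's note: the "must override a superclass method" errors in the quoted log are the usual Eclipse symptom of using @Override on a method that implements an *interface* method while the compiler compliance level is set below 1.6; that usage is only legal from Java 6 on. A minimal sketch of the situation, with stand-in class names (not Hive's actual sources):]

```java
// At Eclipse compiler compliance 5.0 the @Override below is rejected with
// "must override a superclass method"; at 1.6 or higher it compiles fine.
interface NodeProcessor {
    Object process(Object nd, Object ctx, Object... args);
}

public class OverrideDemo implements NodeProcessor {
    @Override  // legal in Java 6+, a compile error at compliance level 5.0
    public Object process(Object nd, Object ctx, Object... args) {
        return "processed";
    }

    public static void main(String[] args) {
        NodeProcessor p = new OverrideDemo();
        System.out.println(p.process(null, null));
    }
}
```

[If this is the cause, raising Preferences > Java > Compiler > Compiler compliance level to 1.6 for the hive project should make these compilation errors disappear.]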
> >> Table doesnotexist does not exist
> >> Testing Filter Operator
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
> >>     at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >>     at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Testing FileSink Operator
> >> FileSink Operator ok
> >> Testing Script Operator
> >> [0] io.o=[1, 01]
> >> [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [1] io.o=[2, 11]
> >> [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [2] io.o=[3, 21]
> >> [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [3] io.o=[4, 31]
> >> [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> [4] io.o=[5, 41]
> >> [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> >> Script Operator ok
> >> Testing Map Operator
> >> io1.o.toString() = [[0, 1, 2]]
> >> io2.o.toString() = [[0, 1, 2]]
> >> answer.toString() = [[0, 1, 2]]
> >> io1.o.toString() = [[1, 2, 3]]
> >> io2.o.toString() = [[1, 2, 3]]
> >> answer.toString() = [[1, 2, 3]]
> >> io1.o.toString() = [[2, 3, 4]]
> >> io2.o.toString() = [[2, 3, 4]]
> >> answer.toString() = [[2, 3, 4]]
> >> io1.o.toString() = [[3, 4, 5]]
> >> io2.o.toString() = [[3, 4, 5]]
> >> answer.toString() = [[3, 4, 5]]
> >> io1.o.toString() = [[4, 5, 6]]
> >> io2.o.toString() = [[4, 5, 6]]
> >> answer.toString() = [[4, 5, 6]]
> >> Map Operator ok
> >> JEXL library test ok
> >> Evaluating 1 + 2 for 10000000 times
> >> Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
> >> Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
> >> Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
> >> java.io.FileNotFoundException: join1.q (No such file or directory)
> >>     at java.io.FileInputStream.open(Native Method)
> >>     at java.io.FileInputStream.<init>(Unknown Source)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
> >>     at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
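[Editor's note: the FileNotFoundException for join1.q above is consistent with a relative path being resolved against the Eclipse launch configuration's working directory rather than the directory the test expects. A quick illustrative check of where Java resolves relative paths (stand-in code, not part of the Hive sources):]

```java
import java.io.File;

public class Cwd {
    public static void main(String[] args) {
        // Relative paths such as "join1.q" are resolved against the JVM's
        // working directory (the user.dir system property). In Eclipse this
        // defaults to the project root unless the Run Configuration's
        // "Working directory" setting overrides it.
        System.out.println(System.getProperty("user.dir"));
        System.out.println(new File("join1.q").getAbsolutePath());
    }
}
```

[If the path printed does not match where the .q files live, adjusting the Run Configuration's working directory may fix this failure.]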
> >> ExprNodeFuncEvaluator ok
> >> ExprNodeColumnEvaluator ok
> >> testExprNodeConversionEvaluator ok
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> >>     at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> >>     at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
> >> bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
> >> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> >> bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
> >> bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
> >> o class = class java.util.ArrayList
> >> o size = 6
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> >> Beginning Test testTBinarySortableProtocol:
> >> Testing struct test { double hello}
> >> Testing struct test { i32 hello}
> >> Testing struct test { i64 hello}
> >> Testing struct test { string hello}
> >> Testing struct test { string hello, double another}
> >> Test testTBinarySortableProtocol passed!
> >> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> >> compare to =234 firstStringsecondString firstKey1secondKey2>
> >> o class = class java.util.ArrayList
> >> o size = 3
> >> o[0] class = class java.lang.Integer
> >> o[1] class = class java.util.ArrayList
> >> o[2] class = class java.util.HashMap
> >> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
> >> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> >> compare to =234 firstStringsecondString firstKey1secondKey2>
> >> o class = class java.util.ArrayList
> >> o size = 3
> >> o = [234, null, {firstKey=1, secondKey=2}]
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
> >> OK
> >> OK
> >> Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
> >> Loading data to table testhivedrivertable
> >> OK
> >> OK
> >> OK
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
> >> Begin query: altern1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
> >> Done query: altern1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
> >> Begin query: bad_sample_clause.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
> >> Done query: bad_sample_clause.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
> >> Begin query: clusterbydistributeby.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
> >> Done query: clusterbydistributeby.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
> >> Begin query: clusterbysortby.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
> >> Done query: clusterbysortby.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
> >> Begin query: clustern1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
> >> Begin query: clustern2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
> >> Begin query: clustern3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> The method process(Node, NodeProcessorCtx,
> >> Object...) of type
> TypeCheckProcFactory.StrExprProcessor
> >> must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
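[Editor's note: the "must override a superclass method" error above is an Eclipse JDT compile error, not a Hive bug. It typically appears when the project's compiler compliance level is below 1.6: Java 5 only permits @Override on methods that override a *class* method, while Java 6 also allows it on implementations of interface methods such as NodeProcessor.process. A minimal sketch, using hypothetical names modeled on the trace:]

```java
// Hypothetical stand-in for org.apache.hadoop.hive.ql.lib.NodeProcessor.
interface NodeProcessor {
    Object process(String node, Object... args);
}

// At 1.5 compliance, Eclipse rejects the @Override below with
// "The method process(...) must override a superclass method";
// at 1.6+ compliance it compiles, because @Override may also mark
// an implementation of an interface method.
class StrExprProcessorDemo implements NodeProcessor {
    @Override
    public Object process(String node, Object... args) {
        return node; // trivially echo the node for demonstration
    }

    public static void main(String[] args) {
        // prints "ok" when compiled at 1.6+ compliance
        System.out.println(new StrExprProcessorDemo().process("ok"));
    }
}
```

[Raising Project > Properties > Java Compiler > Compiler compliance level to 1.6 (or removing the @Override annotations) makes these unresolved-compilation errors go away.]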
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
> >> Begin query: clustern4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
> >> Begin query: describe_xpath1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
> >> Done query: describe_xpath1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
> >> Begin query: describe_xpath2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
> >> Done query: describe_xpath2.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
> >> Begin query: describe_xpath3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
> >> Done query: describe_xpath3.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
> >> Begin query: describe_xpath4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
> >> Done query: describe_xpath4.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
> >> Begin query: fileformat_bad_class.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
> >> Done query: fileformat_bad_class.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
> >> Begin query: fileformat_void_input.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
> >> Begin query: fileformat_void_output.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
> >> Done query: fileformat_void_output.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
> >> Begin query: input1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
> >> Done query: input1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
> >> Begin query: input2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
> >> Begin query: input_testxpath4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
> >> Begin query: invalid_create_tbl1.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
> >> Done query: invalid_create_tbl1.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
> >> Begin query: invalid_create_tbl2.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
> >> Done query: invalid_create_tbl2.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
> >> Begin query: invalid_select_expression.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
> >> Done query: invalid_select_expression.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
> >> Begin query: invalid_tbl_name.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
> >> Done query: invalid_tbl_name.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
> >> Begin query: joinneg.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
> >> Done query: joinneg.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
> >> Begin query: load_wrong_fileformat.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
> >> Done query: load_wrong_fileformat.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
> >> Begin query: notable_alias3.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> java.lang.Error: Unresolved compilation problem:
> >> 	The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >> 	at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >> 	at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >> 	at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >> 	at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >> 	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >> 	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >> 	at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >> 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> 	at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >> 	at java.lang.reflect.Method.invoke(Unknown Source)
> >> 	at junit.framework.TestCase.runTest(TestCase.java:154)
> >> 	at junit.framework.TestCase.runBare(TestCase.java:127)
> >> 	at junit.framework.TestResult$1.protect(TestResult.java:106)
> >> 	at junit.framework.TestResult.runProtected(TestResult.java:124)
> >> 	at junit.framework.TestResult.run(TestResult.java:109)
> >> 	at junit.framework.TestCase.run(TestCase.java:118)
> >> 	at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >> 	at junit.framework.TestSuite.run(TestSuite.java:203)
> >> 	at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >> 	at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >> 	at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >> at
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
> >> Begin query: notable_alias4.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
> >> Begin query: strict_pruning.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> Exception: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
> >>
> >> java.lang.Error: Unresolved compilation problem:
> >>     The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
> >>
> >>     at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> >>     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> >>     at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> >>     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> >>     at java.lang.reflect.Method.invoke(Unknown Source)
> >>     at junit.framework.TestCase.runTest(TestCase.java:154)
> >>     at junit.framework.TestCase.runBare(TestCase.java:127)
> >>     at junit.framework.TestResult$1.protect(TestResult.java:106)
> >>     at junit.framework.TestResult.runProtected(TestResult.java:124)
> >>     at junit.framework.TestResult.run(TestResult.java:109)
> >>     at junit.framework.TestCase.run(TestCase.java:118)
> >>     at junit.framework.TestSuite.runTest(TestSuite.java:208)
> >>     at junit.framework.TestSuite.run(TestSuite.java:203)
> >>     at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> >>     at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> >>     at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
> >> Begin query: subq_insert.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
> >> Done query: subq_insert.q
> >> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
> >> Begin query: union.q
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> >> OK
> >> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table srcbucket
> >> OK
> >> Loading data to table src
> >> OK
> >> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
> >> Done query: union.q
> >>
> =====================================================================
> >>
> >> Thanks,
> >> Shyam
> >>
> >>
> >>
> >>
> >> --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> >>
> >>> From: Prasad Chakka <pr...@facebook.com>
> >>> Subject: Re: Eclipse run fails !!
> >>> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> >>> Date: Tuesday, February 3, 2009, 2:51 PM
> >>> I think there are multiple issues. Please do the following:
> >>>
> >>> 1. 'ant clean' in the hive directory
> >>> 2. Delete the project in eclipse
> >>> 3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start an HDFS cluster, since the unit tests work on the local file system.
> >>> 4. Check that the java version is 1.6
> >>> 5. Follow the steps in the hive eclipse setup wiki with -Dhadoop.version=0.17.2.1
> >>> 6. Open Eclipse and import the project
> >>> 7. Open the project preferences and make sure it is using java 6. If it is not, change it to use java 6 (let me know if you need help here). If you change it, make sure you rebuild the project by doing a clean.
> >>> 8. Make sure there are no compilation problems for the hive project (check the 'problems' tab in the bottom panel of Eclipse)
> >>> 9. Run the JUnit test case. It should run without any warning dialogs.
> >>>
> >>> Let me know which of these steps fail and what the error is. You need not change any files to run a junit testcase. Once you are at this point, we can help you in setting up a command shell that talks to DFS.
> >>>
> >>> Prasad
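The java-version checks in steps 4 and 7 matter because of the other failure mode that appears further down this thread, `UnsupportedClassVersionError: Bad version number in .class file`: it is thrown when a class was compiled by a newer javac than the JVM loading it. A rough self-contained sketch (illustrative only, not part of Hive) that reads a class file's major version, where 49 = Java 5 and 50 = Java 6:

```java
import java.io.DataInputStream;

// Reads the major version of a loaded class's .class file to tell which JDK
// compiled it (49 = Java 5, 50 = Java 6). Hive classes built with Java 6 but
// loaded on a Java 5 JRE trigger UnsupportedClassVersionError, as seen later
// in this thread. This helper is illustrative, not part of Hive or Eclipse.
public class ClassVersionCheck {
    static int majorVersion(Class<?> c) {
        String resource = "/" + c.getName().replace('.', '/') + ".class";
        try (DataInputStream in = new DataInputStream(c.getResourceAsStream(resource))) {
            if (in.readInt() != 0xCAFEBABE) {
                return -1;                 // not a valid class file
            }
            in.readUnsignedShort();        // skip minor version
            return in.readUnsignedShort(); // major version
        } catch (Exception e) {
            return -1;                     // resource missing or unreadable
        }
    }

    public static void main(String[] args) {
        System.out.println("class file major version: "
                + majorVersion(ClassVersionCheck.class));
    }
}
```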
> >>>
> >>> ________________________________
> >>> From: Ashish Thusoo <at...@facebook.com>
> >>> Reply-To: <hi...@hadoop.apache.org>
> >>> Date: Tue, 3 Feb 2009 14:41:12 -0800
> >>> To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
> >>> Subject: RE: Eclipse run fails !!
> >>>
> >>> Actually for running hive through eclipse you don't need to download and start hadoop. Hive tests automatically create a local instance of hdfs and map/reduce and are able to run it.
> >>>
> >>> The errors that you are getting seem to indicate some jpox plugins missing in eclipse. Prasad is an expert in that area and can perhaps comment on that...
> >>>
> >>> Ashish
> >>>
> >>> -----Original Message-----
> >>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>> Sent: Tuesday, February 03, 2009 2:30 PM
> >>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> >>> Subject: RE: Eclipse run fails !!
> >>>
> >>> Dear Ashish,
> >>>
> >>> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error:
> >>> ==============================================================
> >>> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
> >>> starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> >>> localhost: Exception in thread "main" java.lang.NullPointerException
> >>> localhost:     at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> >>> localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> >>> localhost:     at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> >>> starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> >>> ssarkar@localhost's password:
> >>> localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> >>> [ssarkar@ayush2 hadoop-0.17.0]$
> >>> ===================================================================
> >>>
> >>> Next I loaded the hive project into eclipse following the steps in the hive wiki.
> >>> I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error:
> >>>
> >>> "Errors exist in required Project(s):
> >>>
> >>> hive
> >>>
> >>> Proceed with Launch ?"
> >>>
> >>> When I launched, I got the following errors:
> >>>
> >>> =================================================================
> >>> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
> >>> 09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
> >>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
> >>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
> >>> 09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
> >>> 09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
> >>> 09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
> >>> 09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
> >>> java.lang.UnsupportedClassVersionError: Bad version number in .class file
> >>>     at java.lang.ClassLoader.defineClass1(Native Method)
> >>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> >>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> >>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> >>>     at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> >>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> >>>     at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
> >>>     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
> >>>     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> >>>     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> >>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
> >>>     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
> >>>     at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
> >>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >>>     at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
> >> orImpl.java:39)
> >>> at
> >>>
> >>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
> >> orAccessorImpl.java:27)
> >>> at
> >>>
> >>
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
> >> estLoader.java:102)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
> >> 3TestLoader.java:59)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:445)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >>> java.lang.ExceptionInInitializerError
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> >>> Method)
> >>> at
> >>>
> >>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
> >> orImpl.java:39)
> >>> at
> >>>
> >>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
> >> orAccessorImpl.java:27)
> >>> at
> >>>
> >>
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> >>> at
> >>>
> >>
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
> >> estLoader.java:102)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
> >> 3TestLoader.java:59)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:445)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
> >> ner.java:673)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
> >> ava:386)
> >>> at
> >>>
> >>
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
> >> java:196)
> >>> Caused by: java.lang.RuntimeException:
> Encountered
> >>> throwable
> >>> at
> >>>
> >>
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:11
> >> 3)
> >>> ... 13 more
> >>>
> >>
> ======================================================================
> >>>
> >>> regards,
> >>> Shyam
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> >>>
> >>>> From: Ashish Thusoo <at...@facebook.com>
> >>>> Subject: RE: Eclipse run fails !!
> >>>> To: "Shyam Sarkar" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> >>>> Date: Tuesday, February 3, 2009, 1:46 PM
> >>>>
> >>>> Hi Shyam,
> >>>>
> >>>> I can certainly say that 0.17.0 should work with eclipse. I have been doing that for a while.
> >>>>
> >>>> Maybe we can concentrate on fixing why you are not able to create a table in HDFS. I am not sure why you could not create the /user/hive/warehouse directory in 0.17. Are you saying that
> >>>>
> >>>> hadoop dfs -mkdir /user/facebook/hive
> >>>>
> >>>> does not work for you? Can you send out the output when you run this command?
> >>>>
> >>>> Ashish
> >>>>
> >>>> PS: using -Dhadoop.version="0.17.0" for all the commands that are given in the wiki should make things work in eclipse.
> >>>>
> >>>> -----Original Message-----
> >>>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>>> Sent: Tuesday, February 03, 2009 12:00 PM
> >>>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> >>>> Subject: RE: Eclipse run fails !!
> >>>>
> >>>> Dear Ashish,
> >>>>
> >>>> For the last few days I tried eclipse 3.4.1 with the 0.17.2.1 version and got the same errors with run->run. Then I looked into the bin/hive command and found that it could not create a table in HDFS. The reason was that I could not create the /user/hive/warehouse directory inside HDFS. It was using the Linux FS. This is why I switched to 0.19.0, where directories in HDFS can be created.
> >>>>
> >>>> Could you please tell me which exact version of hadoop will work fine with eclipse runs? I want to get rid of errors in the project itself (before any run).
> >>>>
> >>>> Regards,
> >>>> Shyam
> >>>>
> >>>> --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> >>>>
> >>>>> From: Ashish Thusoo <at...@facebook.com>
> >>>>> Subject: RE: Eclipse run fails !!
> >>>>> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> >>>>> Date: Tuesday, February 3, 2009, 11:38 AM
> >>>>>
> >>>>> Hi Shyam,
> >>>>>
> >>>>> We have not really tried the eclipse stuff for 0.19.0. Is it possible for you to use 0.17.0 for now, while we figure this out...
> >>>>>
> >>>>> Ashish
> >>>>>
> >>>>> -----Original Message-----
> >>>>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> >>>>> Sent: Tuesday, February 03, 2009 11:26 AM
> >>>>> To: hive-dev@hadoop.apache.org
> >>>>> Subject: Eclipse run fails !!
> >>>>>
> >>>>> Hello,
> >>>>>
> >>>>> I have the hive project loaded inside eclipse 3.4.1, and hadoop 0.19.0 is running in the background. I could create tables from the bin/hive command. But when I try to run->run inside eclipse it says:
> >>>>>
> >>>>> "Errors exist with required project(s):
> >>>>>
> >>>>> hive
> >>>>>
> >>>>> Proceed with launch ?"
> >>>>>
> >>>>> and then it gives many errors.
> >>>>>
> >>>>> Can someone please tell me why there are errors in project hive? I followed all the steps correctly from the hive wiki.
> >>>>>
> >>>>> Regards,
> >>>>> shyam_sarkar@yahoo.com
> >
> >
> >
Re: Eclipse run fails !!
Posted by Raghu Murthy <ra...@facebook.com>.
Shyam,
Please go ahead and remove the @Override annotations. This is an Eclipse issue:
versions prior to 3.4.1 actually add @Override even to methods that implement
interface methods, while version 3.4.1 correctly complains that, under Java 5
rules, @Override should not be present on methods inherited from interfaces.
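For reference, the disagreement can be reproduced in isolation. The sketch below uses made-up names (Resolver, OverrideDemo), not anything from Hive; the point is only where @Override sits:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical interface mirroring the shape of UDFMethodResolver.
interface Resolver {
    String resolve(List<String> args);
}

public class OverrideDemo implements Resolver {
    // resolve() comes from an interface, not a superclass. Java 6 and
    // Eclipse 3.4.1 accept @Override here; a compiler enforcing the
    // Java 5 rules rejects it with "must override a superclass method".
    // Deleting the annotation compiles everywhere.
    @Override
    public String resolve(List<String> args) {
        return "resolved " + args.size() + " args";
    }

    public static void main(String[] args) {
        Resolver r = new OverrideDemo();
        System.out.println(r.resolve(Arrays.asList("a", "b")));
    }
}
```

Since the annotation is purely informational, removing it changes nothing at runtime, which is why stripping it is a safe workaround.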
Can you submit a JIRA so that we can fix this?
raghu
On 2/3/09 5:28 PM, "Shyam Sarkar" <sh...@yahoo.com> wrote:
> Dear Prasad,
>
> I did a clean and then performed a full build of the hive project. I am getting 10
> errors and 1706 warnings. All of the errors are about "must override a superclass
> method". It seems to be a compiler problem. I have added jre1.6.0_11 as the build
> JRE. Why is the following problem occurring?
>
> UDFMethodResolver is an interface:
>
> public interface UDFMethodResolver {
>
>   public Method getEvalMethod(List<Class<?>> argClasses)
>       throws AmbiguousMethodException;
> }
>
> The following method should override the method above:
>
> public Method getEvalMethod(List<Class<?>> argClasses)
>     throws AmbiguousMethodException {
>   assert(argClasses.size() == 2);
>
>   List<Class<?>> pClasses = null;
>   if (argClasses.get(0) == Void.class ||
>       argClasses.get(1) == Void.class) {
>     pClasses = new ArrayList<Class<?>>();
>     pClasses.add(Double.class);
>     pClasses.add(Double.class);
>   }
>   else if (argClasses.get(0) == argClasses.get(1)) {
>     pClasses = argClasses;
>   }
>   else if (argClasses.get(0) == java.sql.Date.class ||
>            argClasses.get(1) == java.sql.Date.class) {
>     pClasses = new ArrayList<Class<?>>();
>     pClasses.add(java.sql.Date.class);
>     pClasses.add(java.sql.Date.class);
>   }
>   else {
>     pClasses = new ArrayList<Class<?>>();
>     pClasses.add(Double.class);
>     pClasses.add(Double.class);
>   }
>
>   Method udfMethod = null;
>
>   for (Method m : Arrays.asList(udfClass.getMethods())) {
>     if (m.getName().equals("evaluate")) {
>
>       Class<?>[] argumentTypeInfos = m.getParameterTypes();
>
>       boolean match = (argumentTypeInfos.length == pClasses.size());
>
>       for (int i = 0; i < pClasses.size() && match; i++) {
>         Class<?> accepted =
>             ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
>         if (accepted != pClasses.get(i)) {
>           match = false;
>         }
>       }
>
>       if (match) {
>         if (udfMethod != null) {
>           throw new AmbiguousMethodException(udfClass, argClasses);
>         }
>         else {
>           udfMethod = m;
>         }
>       }
>     }
>   }
>   return udfMethod;
> }
>
> }
>
>
>
> =====================Errors and Warnings=======================
> Description Resource Path Location Type
> The method add_partition(Partition) of type MetaStoreClient must override a
> superclass method MetaStoreClient.java
> hive/metastore/src/java/org/apache/hadoop/hive/metastore line 466
> Java Problem
> The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver
> must override a superclass method ComparisonOpMethodResolver.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 54 Java Problem
> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must
> override a superclass method NumericOpMethodResolver.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 52 Java Problem
> The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver
> must override a superclass method UDFIf.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 81 Java Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.BoolExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 205 Java
> Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.ColumnExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 245 Java
> Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.DefaultExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 584 Java
> Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.NullExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 94 Java Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.NumExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 121 Java
> Problem
> The method process(Node, NodeProcessorCtx, Object...) of type
> TypeCheckProcFactory.StrExprProcessor must override a superclass method
> TypeCheckProcFactory.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 163 Java
> Problem
> AbstractList is a raw type. References to generic type AbstractList<E> should
> be parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 361 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 135 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 234 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 306 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 307 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 370 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized JJTthrift_grammarState.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 13 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized JJTthrift_grammarState.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 14 Java
> Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized OptionsProcessor.java
> hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized OptionsProcessor.java
> hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ScriptOperator.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized ScriptOperator.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized Utilities.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 249 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized Utilities.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized Utilities.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
> ArrayList is a raw type. References to generic type ArrayList<E> should be
> parameterized thrift_grammar.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 2283
> Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ByteStreamTypedSerDe.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ColumnInfo.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 56 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 158 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 186 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 193 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 200 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 377 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 389 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 404 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ComplexSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 412 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ConstantTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 65 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ConstantTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ConstantTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 74 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ConstantTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeStructBase.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 50 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeBase.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 44 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeBool.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 70 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeDouble.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 67 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeList.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 40 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeMap.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 46 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeMap.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 48 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeMap.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeSet.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeSet.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 51 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypeString.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 45 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypei16.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypei32.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 65 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized DynamicSerDeTypei64.java
> hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized FetchTask.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 112 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized FetchTask.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 113 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized FetchTask.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 115 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized HiveInputFormat.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/io line 148 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized HiveInputFormat.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/io line 149 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized HiveInputFormat.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/io line 151 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized HiveInputFormat.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/io line 173 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized HiveInputFormat.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/io line 209 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized JuteSerDe.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 79 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized JuteSerDe.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 96 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MapOperator.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 96 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MetadataTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 44 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MetadataTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 93 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MetadataTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 98 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MetadataTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 102
> Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized MetadataTypedSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 106
> Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized PrimitiveTypeInfo.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 39 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized PrimitiveTypeInfo.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 52 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized PrimitiveTypeInfo.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 58 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized PrimitiveTypeInfo.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 66 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized RandomDimension.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/metadata line 32 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 30 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 31 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 36 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 37 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 41 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 81 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 84 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 119 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ReflectionSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 68 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 73 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 35 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 43 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 85 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 122 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized SerDeUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized TReflectionUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 27 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized TReflectionUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 32 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized TReflectionUtils.java
> hive/serde/src/java/org/apache/hadoop/hive/serde line 45 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ThriftByteStreamTypedSerDe.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 100 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ThriftByteStreamTypedSerDe.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 124 Java
> Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized ThriftSerDeField.java
> hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 33 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized UDFIf.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 67 Java Problem
> Class is a raw type. References to generic type Class<T> should be
> parameterized Utilities.java
> hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 493 Java Problem
>
> =======================================================================
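[Editor's note: the "Class is a raw type" entries quoted above are warnings, not errors, and would not by themselves block the JUnit launch. A minimal sketch of what Eclipse is flagging and how to silence it; the class and method names here are hypothetical, not from the Hive source.]

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypeDemo {
    // Raw type: this compiles, but Eclipse reports "Class is a raw type.
    // References to generic type Class<T> should be parameterized".
    static String rawName(Class c) {
        return c.getName();
    }

    // Parameterized with a wildcard: same runtime behavior, no warning.
    static String typedName(Class<?> c) {
        return c.getName();
    }

    public static void main(String[] args) {
        System.out.println(rawName(List.class));        // prints: java.util.List
        System.out.println(typedName(ArrayList.class)); // prints: java.util.ArrayList
    }
}
```

Changing `Class` to `Class<?>` at the flagged sites would clear these warnings without changing behavior, but the launch failure itself comes from the compile errors discussed below, not from these.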
>
> Do you have any suggestions?
>
> Thanks,
> Shyam
>
>
>
> --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
>
>> From: Prasad Chakka <pr...@facebook.com>
>> Subject: Re: Eclipse run fails !!
>> To: "shyam_sarkar@yahoo.com" <sh...@yahoo.com>,
>> "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
>> Date: Tuesday, February 3, 2009, 4:57 PM
>> There are compilation errors in the Hive project, which is why
>> running the tests fails. Could you send us the compilation
>> errors? One of them should be on the following line. It is most
>> probably an Eclipse/Java compliance issue. You can most likely
>> remove the @Override annotation and get a successful
>> compilation. If there are any more errors, send them to us.
>>
>> The method getEvalMethod(List<Class<?>>) of
>> type NumericOpMethodResolver must override a superclass
>> method
>> at
>> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
>> pMethodResolver.java:52)
>>
>>
>> ________________________________
>> From: Shyam Sarkar <sh...@yahoo.com>
>> Reply-To: <sh...@yahoo.com>
>> Date: Tue, 3 Feb 2009 16:51:47 -0800
>> To: <hi...@hadoop.apache.org>, Prasad Chakka
>> <pr...@facebook.com>
>> Subject: Re: Eclipse run fails !!
>>
>> Dear Prasad,
>>
>> I followed your instructions with Hadoop version 0.17.2.1
>> and changed the JRE to version 1.6_11. When I ran the JUnit
>> tests, I still got the following message:
>>
>> "Errors exist in required Project(s):
>> hive
>> Proceed with Launch ?"
>>
>> When I launched, I got the following errors:
>> =================================== It is long
>> ======================
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown
>> Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -1598728140.txt
>> Begin query: sample6.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
>> Exception: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> NumericOpMethodResolver must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> NumericOpMethodResolver must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
>> pMethodResolver.java:52)
>> at
>> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
>> .java:274)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
>> nalyzer.java:2872)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
>> r.java:2985)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3027)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:10
>> 44)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1450160017.txt
>> Begin query: sample7.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
>> Exception: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> NumericOpMethodResolver must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> NumericOpMethodResolver must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
>> pMethodResolver.java:52)
>> at
>> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
>> .java:274)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticA
>> nalyzer.java:2872)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyze
>> r.java:2985)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3027)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:10
>> 70)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 514371634.txt
>> Begin query: subq.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
>> er.java:904)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2712)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3000)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3021)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 520907971.txt
>> Begin query: udf1.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
>> er.java:904)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2712)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 656206857.txt
>> Begin query: udf4.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
>> er.java:1167)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2724)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>>
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 545867528.txt
>> Begin query: udf6.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
>> er.java:1167)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2724)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -1947338661.txt
>> Begin query: union.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff
>> /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out
>> /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyz
>> er.java:904)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2712)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3000)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3003)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3021)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
>> at
>> org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Table doesnotexist does not exist
>> Testing Filter Operator
>> java.lang.Error: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> ComparisonOpMethodResolver must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(Compa
>> risonOpMethodResolver.java:54)
>> at
>> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
>> .java:274)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
>> at
>> org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOpera
>> tors.java:79)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Testing FileSink Operator
>> FileSink Operator ok
>> Testing Script Operator
>> [0] io.o=[1, 01]
>> [0]
>> io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspe
>> ctor@acb988
>> [1] io.o=[2, 11]
>> [1]
>> io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspe
>> ctor@acb988
>> [2] io.o=[3, 21]
>> [2]
>> io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspe
>> ctor@acb988
>> [3] io.o=[4, 31]
>> [3]
>> io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspe
>> ctor@acb988
>> [4] io.o=[5, 41]
>> [4]
>> io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspe
>> ctor@acb988
>> Script Operator ok
>> Testing Map Operator
>> io1.o.toString() = [[0, 1, 2]]
>> io2.o.toString() = [[0, 1, 2]]
>> answer.toString() = [[0, 1, 2]]
>> io1.o.toString() = [[1, 2, 3]]
>> io2.o.toString() = [[1, 2, 3]]
>> answer.toString() = [[1, 2, 3]]
>> io1.o.toString() = [[2, 3, 4]]
>> io2.o.toString() = [[2, 3, 4]]
>> answer.toString() = [[2, 3, 4]]
>> io1.o.toString() = [[3, 4, 5]]
>> io2.o.toString() = [[3, 4, 5]]
>> answer.toString() = [[3, 4, 5]]
>> io1.o.toString() = [[4, 5, 6]]
>> io2.o.toString() = [[4, 5, 6]]
>> answer.toString() = [[4, 5, 6]]
>> Map Operator ok
>> JEXL library test ok
>> Evaluating 1 + 2 for 10000000 times
>> Evaluation finished: 0.562 seconds, 0.056 seconds/million
>> call.
>> Evaluating __udf__concat.evaluate("1",
>> "2") for 1000000 times
>> Evaluation finished: 1.028 seconds, 1.028 seconds/million
>> call.
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1713747826.txt
>> java.io.FileNotFoundException: join1.q (No such file or
>> directory)
>> at java.io.FileInputStream.open(Native Method)
>> at java.io.FileInputStream.<init>(Unknown
>> Source)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
>> at
>> org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
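The FileNotFoundException above ("join1.q (No such file or directory)") is the usual symptom of launching the JUnit tests from the wrong working directory: a relative path like "join1.q" is resolved against the JVM's working directory (the user.dir system property), which Eclipse sets per launch configuration. A minimal sketch that shows where such a relative path actually resolves (the file name join1.q comes from the trace; the class name here is illustrative, not part of Hive):

```java
import java.io.File;

public class WorkingDirCheck {
    public static void main(String[] args) {
        // Relative paths resolve against the JVM's working directory,
        // which Eclipse sets in the run configuration's Arguments tab.
        String cwd = System.getProperty("user.dir");
        File f = new File("join1.q");
        System.out.println("working dir       = " + cwd);
        System.out.println("join1.q resolves to " + f.getAbsolutePath());
        System.out.println("exists            = " + f.exists());
    }
}
```

If the printed path is not where the .q files live, pointing the launch configuration's working directory at the right module (or using an absolute path) should make the test find the file.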
>> ExprNodeFuncEvaluator ok
>> ExprNodeColumnEvaluator ok
>> testExprNodeConversionEvaluator ok
>> java.lang.Error: Unresolved compilation problem:
>> The method
>> getEvalMethod(List<Class<?>>) of type
>> NumericOpMethodResolver must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericO
>> pMethodResolver.java:52)
>> at
>> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry
>> .java:274)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:423)
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.get
>> FuncExprNodeDesc(TypeCheckProcFactory.java:379)
>> at
>> org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(Test
>> ExpressionEvaluator.java:168)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
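The "Unresolved compilation problem" errors above (getEvalMethod and, later, process "must override a superclass method") look like the classic symptom of compiling Java 6 style code with Eclipse's compiler compliance set to 5.0: under 1.5 rules, @Override is rejected on a method that implements an interface method rather than overrides a concrete superclass method, and Eclipse then throws java.lang.Error at runtime for the class it could not compile. A minimal sketch of the pattern, assuming that is the cause here (Resolver and OverrideDemo are illustrative names, not Hive's actual types):

```java
// Compiles under Java 6 compliance. Under Eclipse 5.0 compliance the
// @Override below is flagged "must override a superclass method", and
// running the class anyway produces java.lang.Error:
// "Unresolved compilation problem".
interface Resolver {
    String resolve(String name);
}

public class OverrideDemo implements Resolver {
    @Override  // legal on interface-implementing methods only from Java 6 on
    public String resolve(String name) {
        return "resolved:" + name;
    }

    public static void main(String[] args) {
        System.out.println(new OverrideDemo().resolve("concat"));
    }
}
```

If this is what is happening, raising Project > Properties > Java Compiler > Compiler compliance level to 1.6 (or removing the offending @Override annotations) should clear these failures.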
>> input struct = [234, [firstString, secondString],
>> {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol:
>> org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
>> TypeName =
>> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
>> ble,nd:double}
>> bytes
>> =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x
>> 65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79
>> x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxf
>> fx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol:
>> org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
>> TypeName =
>> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
>> ble,nd:double}
>> bytes
>> =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx
>> 9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86
>> xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x0
>> 0xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol:
>> com.facebook.thrift.protocol.TBinaryProtocol
>> TypeName =
>> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
>> ble,nd:double}
>> bytes
>> =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x
>> 74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08
>> x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x6
>> 5x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x
>> 00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol:
>> com.facebook.thrift.protocol.TJSONProtocol
>> TypeName =
>> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
>> ble,nd:double}
>> bytes
>> =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x
>> 6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67
>> x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx2
>> 2x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x
>> 73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2c
>> x22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x6
>> 4x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x
>> 7dx7d
>> bytes in text
>> ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{
>> "map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"
>> dbl":1.0},"-6":{"dbl":-2.5}}
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}, -234, 1.0, -2.5]
>> Testing protocol:
>> org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
>> TypeName =
>> struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:dou
>> ble,nd:double}
>> bytes
>> =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x
>> 69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32
>> x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
>> bytes in text
>> =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
>> o class = class java.util.ArrayList
>> o size = 6
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}, -234, 1.0, -2.5]
>> Beginning Test testTBinarySortableProtocol:
>> Testing struct test { double hello}
>> Testing struct test { i32 hello}
>> Testing struct test { i64 hello}
>> Testing struct test { string hello}
>> Testing struct test { string hello, double another}
>> Test testTBinarySortableProtocol passed!
>> bytes in text =234 firstStringsecondString
>> firstKey1secondKey2>
>> compare to =234 firstStringsecondString
>> firstKey1secondKey2>
>> o class = class java.util.ArrayList
>> o size = 3
>> o[0] class = class java.lang.Integer
>> o[1] class = class java.util.ArrayList
>> o[2] class = class java.util.HashMap
>> o = [234, [firstString, secondString], {firstKey=1,
>> secondKey=2}]
>> bytes in text =234 firstStringsecondString
>> firstKey1secondKey2>
>> compare to =234 firstStringsecondString
>> firstKey1secondKey2>
>> o class = class java.util.ArrayList
>> o size = 3
>> o = [234, null, {firstKey=1, secondKey=2}]
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 992344490.txt
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1962723908.txt
>> OK
>> OK
>> Copying data from
>> file:/home/ssarkar/hive/data/files/kv1.txt
>> Loading data to table testhivedrivertable
>> OK
>> OK
>> OK
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 247426390.txt
>> Begin query: altern1.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
>> Done query: altern1.q
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 587924093.txt
>> Begin query: bad_sample_clause.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
>> Done query: bad_sample_clause.q
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -1415770190.txt
>> Begin query: clusterbydistributeby.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.
>> out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q
>> .out
>> Done query: clusterbydistributeby.q
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -1882308680.txt
>> Begin query: clusterbysortby.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
>> Done query: clusterbysortby.q
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1579535074.txt
>> Begin query: clustern1.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
>> er.java:1167)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2724)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(Expla
>> inSemanticAnalyzer.java:43)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at
>> org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_cluste
>> rn1(TestNegativeCliDriver.java:205)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -430224382.txt
>> Begin query: clustern2.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(Seman
>> ticAnalyzer.java:2332)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnal
>> yzer.java:2380)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer
>> .java:2444)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3041)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(Expla
>> inSemanticAnalyzer.java:43)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at
>> org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_cluste
>> rn2(TestNegativeCliDriver.java:230)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -431481701.txt
>> Begin query: clustern3.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
>> er.java:1167)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2724)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(Expla
>> inSemanticAnalyzer.java:43)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at
>> org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_cluste
>> rn3(TestNegativeCliDriver.java:255)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1179496399.txt
>> Begin query: clustern4.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process
>> (TypeCheckProcFactory.java:163)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyz
>> er.java:1167)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2724)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(Expla
>> inSemanticAnalyzer.java:43)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at
>> org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_cluste
>> rn4(TestNegativeCliDriver.java:280)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38
>> )
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.StrExprProcessor
>> must override a superclass method
>>
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> 1998238474.txt
>> Begin query: describe_xpath1.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
>> Done query: describe_xpath1.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
>> Begin query: describe_xpath2.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
>> Done query: describe_xpath2.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
>> Begin query: describe_xpath3.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
>> Done query: describe_xpath3.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
>> Begin query: describe_xpath4.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
>> Done query: describe_xpath4.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
>> Begin query: fileformat_bad_class.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
>> Done query: fileformat_bad_class.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
>> Begin query: fileformat_void_input.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
>> Begin query: fileformat_void_output.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
>> Done query: fileformat_void_output.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
>> Begin query: input1.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
>> Done query: input1.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
>> Begin query: input2.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
>> Begin query: input_testxpath4.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
>> Begin query: invalid_create_tbl1.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
>> Done query: invalid_create_tbl1.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
>> Begin query: invalid_create_tbl2.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
>> Done query: invalid_create_tbl2.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
>> Begin query: invalid_select_expression.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
>> Done query: invalid_select_expression.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
>> Begin query: invalid_tbl_name.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
>> Done query: invalid_tbl_name.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
>> Begin query: joinneg.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
>> Done query: joinneg.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
>> Begin query: load_wrong_fileformat.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
>> Done query: load_wrong_fileformat.q
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
>> Begin query: notable_alias3.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
>> Begin query: notable_alias4.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>>
>> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
>> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
>> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at junit.framework.TestCase.runTest(TestCase.java:154)
>> at junit.framework.TestCase.runBare(TestCase.java:127)
>> at junit.framework.TestResult$1.protect(TestResult.java:106)
>> at junit.framework.TestResult.runProtected(TestResult.java:124)
>> at junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at junit.framework.TestSuite.run(TestSuite.java:203)
>> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
>> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
>> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
>> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
>> Begin query: strict_pruning.q
>> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> Exception: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.NumExprProcessor
>> must override a superclass method
>>
>> java.lang.Error: Unresolved compilation problem:
>> The method process(Node, NodeProcessorCtx,
>> Object...) of type TypeCheckProcFactory.NumExprProcessor
>> must override a superclass method
>>
>> at
>> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process
>> (TypeCheckProcFactory.java:121)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispa
>> tcher.java:80)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.
>> java:83)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java
>> :113)
>> at
>> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWal
>> ker.java:95)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnal
>> yzer.java:3311)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOper
>> ator(SemanticAnalyzer.java:1688)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAn
>> alyzer.java:1892)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer
>> .java:2721)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.jav
>> a:3048)
>> at
>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnal
>> yzer.java:3229)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(Expla
>> inSemanticAnalyzer.java:43)
>> at
>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnal
>> yzer.java:71)
>> at
>> org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
>> at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
>> at
>> org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
>> at
>> org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict
>> _pruning(TestNegativeCliDriver.java:755)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>> at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
>> Source)
>> at java.lang.reflect.Method.invoke(Unknown Source)
>> at
>> junit.framework.TestCase.runTest(TestCase.java:154)
>> at
>> junit.framework.TestCase.runBare(TestCase.java:127)
>> at
>> junit.framework.TestResult$1.protect(TestResult.java:106)
>> at
>> junit.framework.TestResult.runProtected(TestResult.java:124)
>> at
>> junit.framework.TestResult.run(TestResult.java:109)
>> at junit.framework.TestCase.run(TestCase.java:118)
>> at
>> junit.framework.TestSuite.runTest(TestSuite.java:208)
>> at
>> junit.framework.TestSuite.run(TestSuite.java:203)
>> at
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3Te
>> stReference.java:130)
>> at
>> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:460)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>> at
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -327058962.txt
>> Begin query: subq_insert.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
>> Done query: subq_insert.q
>> Hive history
>> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_
>> -196827093.txt
>> Begin query: union.q
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-08,
>> hr=12}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=11}
>> OK
>> Loading data to table srcpart partition {ds=2008-04-09,
>> hr=12}
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table srcbucket
>> OK
>> Loading data to table src
>> OK
>> diff -I \(file:\)\|\(/tmp/.*\)
>> /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out
>> /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
>> Done query: union.q
>> =====================================================================
>>
>> Thanks,
>> Shyam
>>
>>
>>
>>
>> --- On Tue, 2/3/09, Prasad Chakka
>> <pr...@facebook.com> wrote:
>>
>>> From: Prasad Chakka <pr...@facebook.com>
>>> Subject: Re: Eclipse run fails !!
>>> To: "hive-dev@hadoop.apache.org"
>> <hi...@hadoop.apache.org>,
>> "shyam_sarkar@yahoo.com"
>> <sh...@yahoo.com>
>>> Date: Tuesday, February 3, 2009, 2:51 PM
>>> I think there are multiple issues. Please do the
>> following
>>>
>>>
>>> 1. 'ant clean' in hive directory
>>> 2. delete project in eclipse
>>> 3. Don't change any config values in
>> hive-site.xml
>>> (revert your changes to fs.default.name etc) and
>> don't
>>> start HDFS cluster since in unit tests we are working
>> on
>>> local file system.
>>> 4. Check that the java version is 1.6
>>> 5. Follow the steps in the hive eclipse setup wiki
>> with
>>> -Dhadoop.version=0.17.2.1
>>> 6. Open Eclipse and import the project
>>> 7. Open project preferences and make sure that it is
>>> using java 6. If it is not then change it to use java6
>> (let
>>> me know if you need help here). If you change it then
>> make
>>> sure that you rebuild the project by doing a clean
>>> 8. Make sure that there are no compilation problems
>> for
>>> the hive project (check 'problems' tab in the
>> bottom
>>> panel of Eclipse)
>>> 9. Run the Junit test case. It should run without
>> any
>>> warning dialogs
>>>
>>> Let me know which of these steps fail and what is the
>>> error. You need not change any files to run a junit
>> testcase.
>>> Once you are at this point, we can help you in setting
>> up
>>> command shell that talks to DFS.
>>>
>>> Prasad
>>>
>>> ________________________________
>>> From: Ashish Thusoo <at...@facebook.com>
>>> Reply-To: <hi...@hadoop.apache.org>
>>> Date: Tue, 3 Feb 2009 14:41:12 -0800
>>> To: <sh...@yahoo.com>,
>>> <hi...@hadoop.apache.org>
>>> Subject: RE: Eclipse run fails !!
>>>
>>> Actually for running hive through eclipse you
>> don't
>>> need to download and start hadoop. Hive tests
>> automatically
>>> create a local instance of hdfs and map/reduce and are
>> able
>>> to run it.
>>>
>>> The errors that you are getting seem to indicate some
>> jpox
>>> plugins missing in eclipse. Prasad is an expert in
>> that area
>>> and can perhaps comment on that...
>>>
>>> Ashish
>>>
>>> -----Original Message-----
>>> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
>>> Sent: Tuesday, February 03, 2009 2:30 PM
>>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
>>> Subject: RE: Eclipse run fails !!
>>>
>>> Dear Ashish,
>>>
>>> I downloaded hadoop 0.17.0 and tried bin/start-all.sh
>>> script. I got one error ::
>>>
>> ==============================================================
>>> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
>> starting
>>> namenode, logging to
>>>
>> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2
>> .out
>>> ssarkar@localhost's password:
>>> localhost: starting datanode, logging to
>>>
>> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2
>> .out
>>> ssarkar@localhost's password:
>>> localhost: starting secondarynamenode, logging to
>>>
>> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynameno
>> de-ayush2.out
>>> localhost: Exception in thread "main"
>>> java.lang.NullPointerException
>>> localhost: at
>>>
>> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
>>> localhost: at
>>>
>> org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
>>> localhost: at
>>>
>> org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
>>> starting jobtracker, logging to
>>>
>> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayus
>> h2.out
>>> ssarkar@localhost's password:
>>> localhost: starting tasktracker, logging to
>>>
>> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayu
>> sh2.out
>>> [ssarkar@ayush2 hadoop-0.17.0]$
>>>
>> ===================================================================
>>>
>>> Next I loaded hive project into eclipse following
>> steps in
>>> hive wiki.
>>> I tried Run->Run Configurations->JUnit and
>> selected
>>> TestTruncate to run but got the following error ::
>>>
>>> "Errors exist in required Project(s):
>>>
>>> hive
>>>
>>> Proceed with Launch ?"
>>>
>>> When I launch I got following errors ::
>>>
>>>
>> =================================================================
>>> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0:
>> Opening
>>> raw store with implemenation
>>> class:org.apache.hadoop.hive.metastore.ObjectStore
>>> 09/02/03 14:01:33 INFO metastore.ObjectStore:
>> ObjectStore,
>>> initialize called
>>> 09/02/03 14:01:33 INFO metastore.ObjectStore: found
>>> resource jpox.properties at
>>> file:/home/ssarkar/hive/conf/jpox.properties
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.views" not registered, but
>> plugin
>>> "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.perspectiveExtensions" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.preferencePages" not
>> registered,
>>> but plugin "org.eclipse.jdt.junit" defined
>> in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.keywords" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.debug.core.launchConfigurationTypes"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.debug.core.launchConfigurationComparators"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.debug.ui.launchConfigurationTypeImages"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.debug.ui.launchConfigurationTabGroups"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.newWizards" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.popupMenus" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.actionSets" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.actionSetPartAssociations"
>> not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.debug.ui.launchShortcuts" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.jdt.core.classpathVariableInitializer"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.jdt.ui.quickFixProcessors" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.jdt.ui.classpathFixProcessors"
>> not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.ide.markerResolution" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.core.expressions.propertyTesters" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.ltk.core.refactoring.renameParticipants"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.commands" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.ui.bindings" not registered,
>> but
>>> plugin "org.eclipse.jdt.junit" defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.core.runtime.preferences" not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>>
>> "org.eclipse.jdt.core.classpathContainerInitializer"
>>> not registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
>>> "org.eclipse.jdt.ui.classpathContainerPage"
>> not
>>> registered, but plugin
>> "org.eclipse.jdt.junit"
>>> defined in
>>>
>> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.
>> osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
>>> refers to it.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ui.ide" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ui.views" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.jface.text" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ui.workbench.texteditor" but it
>>> cannot be resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ui.editors" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ui" but it cannot be resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.core.expressions" but it cannot
>> be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.core.resources" but it cannot
>> be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.debug.core" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.debug.ui" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.jdt.core" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.jdt.ui" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.core.runtime" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.jdt.launching" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.jdt.debug.ui" but it cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.compare" but it cannot be
>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ltk.core.refactoring" but it
>> cannot
>>> be resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.core.variables" but it cannot
>> be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit" requires
>>> "org.eclipse.ltk.ui.refactoring" but it
>> cannot be
>>> resolved.
>>> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
>>> "org.eclipse.jdt.junit.runtime" requires
>>> "org.junit" but it cannot be resolved.
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle
>>> "org.jpox" has an optional dependency to
>>> "org.eclipse.equinox.registry" but it cannot
>> be
>>> resolved
>>> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle
>>> "org.jpox" has an optional dependency to
>>> "org.eclipse.core.runtime" but it cannot be
>>> resolved
>>> 09/02/03 14:01:33 INFO JPOX.Persistence:
>> =================
>>> Persistence Configuration ===============
>>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX
>> Persistence
>>> Factory - Vendor: "JPOX" Version:
>>> "1.2.2"
>>> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX
>> Persistence
>>> Factory initialised for datastore
>>>
>> URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true"
>>>
>> driver="org.apache.derby.jdbc.EmbeddedDriver"
>>> userName="APP"
>>> 09/02/03 14:01:33 INFO JPOX.Persistence:
>>>
>> ===========================================================
>>> 09/02/03 14:01:35 INFO Datastore.Schema: Initialising
>>> Catalog "", Schema "APP" using
>>> "SchemaTable" auto-start option
>>> 09/02/03 14:01:36 INFO Datastore.Schema: Catalog
>>> "", Schema "APP" initialised -
>> managing
>>> 0 classes
>>> 09/02/03 14:01:36 INFO JPOX.JDO: >> Found
>>> StoreManager org.jpox.store.rdbms.RDBMSManager
>>> java.lang.UnsupportedClassVersionError: Bad version
>> number
>>> in .class file
>>> at java.lang.ClassLoader.defineClass1(Native
>>> Method)
>>> at
>>>
>> java.lang.ClassLoader.defineClass(ClassLoader.java:620)
>>> at
>>>
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
>>> at
>>>
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
>>> at
>>>
>> java.net.URLClassLoader.access$100(URLClassLoader.java:56)
>>> at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:195)
>>> at
>>> java.security.AccessController.doPrivileged(Native
>> Method)
>>> at
>>>
>> java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>> at
>>>
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
>>> at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:251)
>>> at
>>>
>> java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStor
>> e.java:194)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
>>> at
>>>
>> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
>>> at
>>>
>> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore
>> .java:127)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(Hiv
>> eMetaStore.java:143)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.
>> java:115)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStor
>> e.java:100)
>>> at
>>>
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClie
>> nt.java:73)
>>> at
>>>
>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
>>> at
>>>
>> org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
>>> at
>>>
>> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
>>> at
>>>
>> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
>>> at
>>>
>> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:10
>> 5)
>>> at
>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method)
>>> at
>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
>> orImpl.java:39)
>>> at
>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
>> orAccessorImpl.java:27)
>>> at
>>>
>> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>>> at
>>>
>> junit.framework.TestSuite.createTest(TestSuite.java:131)
>>> at
>>>
>> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
>>> at
>>>
>> junit.framework.TestSuite.<init>(TestSuite.java:75)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
>> estLoader.java:102)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
>> 3TestLoader.java:59)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:445)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>>> java.lang.ExceptionInInitializerError
>>> at
>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>>> Method)
>>> at
>>>
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccess
>> orImpl.java:39)
>>> at
>>>
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
>> orAccessorImpl.java:27)
>>> at
>>>
>> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>>> at
>>>
>> junit.framework.TestSuite.createTest(TestSuite.java:131)
>>> at
>>>
>> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
>>> at
>>>
>> junit.framework.TestSuite.<init>(TestSuite.java:75)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3T
>> estLoader.java:102)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit
>> 3TestLoader.java:59)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:445)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRun
>> ner.java:673)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.j
>> ava:386)
>>> at
>>>
>> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.
>> java:196)
>>> Caused by: java.lang.RuntimeException: Encountered
>>> throwable
>>> at
>>>
>> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:11
>> 3)
>>> ... 13 more
>>>
>> ======================================================================
>>>
>>> regards,
>>> Shyam
>>>
>>>
>>>
>>>
>>>
>>> --- On Tue, 2/3/09, Ashish Thusoo
>>> <at...@facebook.com> wrote:
>>>
>>>> From: Ashish Thusoo <at...@facebook.com>
>>>> Subject: RE: Eclipse run fails !!
>>>> To: "Shyam Sarkar"
>>> <sh...@yahoo.com>,
>>>> "hive-dev@hadoop.apache.org"
>>> <hi...@hadoop.apache.org>
>>>> Date: Tuesday, February 3, 2009, 1:46 PM Hi
>> Shyam,
>>>>
>>>> I can certainly say that 0.17.0 should work with
>>> eclipse. I have been
>>>> doing that for a while.
>>>>
>>>> Maybe we can concentrate on fixing why you are
>> not
>>> able to create a
>>>> table in hdfs. I am not sure why you could not
>> create
>>> the
>>>> /user/hive/warehouse directory in 0.17. Are you
>> saying
>>> that
>>>>
>>>> hadoop dfs -mkdir /user/facebook/hive
>>>>
>>>> does not work for you? Can you send out the
>> output
>>> when you run this
>>>> command.
>>>>
>>>> Ashish
>>>>
>>>> PS: using -Dhadoop.version="0.17.0"
>> for all
>>> the commands that are
>>>> given in the wiki should make things work in
>> eclipse.
>>>>
>>>> -----Original Message-----
>>>> From: Shyam Sarkar
>> [mailto:shyam_sarkar@yahoo.com]
>>>> Sent: Tuesday, February 03, 2009 12:00 PM
>>>> To: hive-dev@hadoop.apache.org; Ashish Thusoo
>>>> Subject: RE: Eclipse run fails !!
>>>>
>>>> Dear Ashish,
>>>>
>>>> For the last few days I tried eclipse 3.4.1 with
>>> 0.17.2.1 version and
>>>> got the same errors with run->run. Then I
>> looked
>>> into bin/hive command
>>>> and found that it could not create table in HDFS.
>> The
>>> reason was that
>>>> I could not create /user/hive/warehouse directory
>>> inside HDFS. It was
>>>> using Linux FS.
>>>> This is why I switched to 0.19.0 where
>> directories in
>>> HDFS can be
>>>> created.
>>>>
>>>> Could you please tell me which exact version of
>> hadoop
>>> will work fine
>>>> with eclipse runs ? I want to get rid of errors
>> in
>>> project itself
>>>> (before any run).
>>>>
>>>> Regards,
>>>> Shyam
>>>>
>>>> --- On Tue, 2/3/09, Ashish Thusoo
>>>> <at...@facebook.com> wrote:
>>>>
>>>>> From: Ashish Thusoo
>> <at...@facebook.com>
>>>>> Subject: RE: Eclipse run fails !!
>>>>> To: "hive-dev@hadoop.apache.org"
>>>> <hi...@hadoop.apache.org>,
>>>>> "shyam_sarkar@yahoo.com"
>>>> <sh...@yahoo.com>
>>>>> Date: Tuesday, February 3, 2009, 11:38 AM Hi
>>> Shyam,
>>>>>
>>>>> We have not really tried the eclipse stuff
>> for
>>> 0.19.0.
>>>> Is it possible
>>>>> for you to use 0.17.0 for now, while we
>> figure
>>> this
>>>> out...
>>>>>
>>>>> Ashish
>>>>>
>>>>> -----Original Message-----
>>>>> From: Shyam Sarkar
>>> [mailto:shyam_sarkar@yahoo.com]
>>>>> Sent: Tuesday, February 03, 2009 11:26 AM
>>>>> To: hive-dev@hadoop.apache.org
>>>>> Subject: Eclipse run fails !!
>>>>>
>>>>> Hello,
>>>>>
>>>>> I have hive project loaded inside eclipse
>> 3.4.1
>>> and
>>>> hadoop 0.19.0 is
>>>>> running in the background. I could create
>> tables
>>> from
>>>> bin/hive
>>>>> command.
>>>>> But when I try to run->run inside eclipse
>> it
>>> says::
>>>>>
>>>>> "Errors exist with required project(s):
>>>>>
>>>>> hive
>>>>>
>>>>> Proceed with launch ?"
>>>>>
>>>>> and then it gives many errors.
>>>>>
>>>>> Can someone please tell me why there are
>> errors
>>> in
>>>> project hive ? I
>>>>> followed all steps correctly from hive wiki.
>>>>>
>>>>> Regards,
>>>>> shyam_sarkar@yahoo.com
>
>
>
Re: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Prasad,
I did a clean and then performed a Build All for project hive. I am getting 10 errors and 1706 warnings. All of the errors are about "must override a superclass method", which seems to be a compiler problem. I have added jre1.6.0_11 as the build JRE. Why is the following problem coming up?
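As a quick sanity check on the JRE question, it can help to print the version properties from a plain Java run configuration inside Eclipse (VersionCheck is a hypothetical helper for illustration, not part of Hive):

```java
// Hypothetical helper: run as a Java application inside Eclipse to see
// which JVM the launch configuration is actually using.
public class VersionCheck {
    public static void main(String[] args) {
        // e.g. "1.6.0_11" when the 1.6 JRE is really on the launch path
        System.out.println("java.version  = " + System.getProperty("java.version"));
        // the class-file format the running JVM supports (50.0 = Java 6, 49.0 = Java 5)
        System.out.println("class.version = " + System.getProperty("java.class.version"));
    }
}
```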
UDFMethodResolver is an interface::
public interface UDFMethodResolver {
  public Method getEvalMethod(List<Class<?>> argClasses)
      throws AmbiguousMethodException;
}
The following method should override the above method ::
public Method getEvalMethod(List<Class<?>> argClasses)
    throws AmbiguousMethodException {
  assert(argClasses.size() == 2);
  List<Class<?>> pClasses = null;
  if (argClasses.get(0) == Void.class ||
      argClasses.get(1) == Void.class) {
    pClasses = new ArrayList<Class<?>>();
    pClasses.add(Double.class);
    pClasses.add(Double.class);
  }
  else if (argClasses.get(0) == argClasses.get(1)) {
    pClasses = argClasses;
  }
  else if (argClasses.get(0) == java.sql.Date.class ||
           argClasses.get(1) == java.sql.Date.class) {
    pClasses = new ArrayList<Class<?>>();
    pClasses.add(java.sql.Date.class);
    pClasses.add(java.sql.Date.class);
  }
  else {
    pClasses = new ArrayList<Class<?>>();
    pClasses.add(Double.class);
    pClasses.add(Double.class);
  }

  Method udfMethod = null;
  for(Method m: Arrays.asList(udfClass.getMethods())) {
    if (m.getName().equals("evaluate")) {
      Class<?>[] argumentTypeInfos = m.getParameterTypes();
      boolean match = (argumentTypeInfos.length == pClasses.size());
      for(int i=0; i<pClasses.size() && match; i++) {
        Class<?> accepted = ObjectInspectorUtils.generalizePrimitive(argumentTypeInfos[i]);
        if (accepted != pClasses.get(i)) {
          match = false;
        }
      }
      if (match) {
        if (udfMethod != null) {
          throw new AmbiguousMethodException(udfClass, argClasses);
        }
        else {
          udfMethod = m;
        }
      }
    }
  }
  return udfMethod;
}
}
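For what it is worth, these errors match a known Java 5 vs Java 6 difference: under 1.5 source compliance, @Override is only legal on a method that overrides a superclass (class) method, so annotating a method that merely implements an interface method produces exactly "must override a superclass method"; under 1.6 compliance the same code is accepted. A minimal sketch with made-up names (Resolver and SimpleResolver are illustrative, not Hive classes):

```java
// Minimal reproduction of "must override a superclass method".
// At 1.5 source compliance the @Override below is a compile error, because
// resolve() implements an interface method rather than overriding a class
// method; at 1.6 compliance it compiles cleanly.
interface Resolver {
    String resolve(String name);
}

class SimpleResolver implements Resolver {
    @Override  // error under Java 5 compliance, fine under Java 6
    public String resolve(String name) {
        return "resolved:" + name;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        Resolver r = new SimpleResolver();
        System.out.println(r.resolve("hive"));
    }
}
```

So if the project's compiler compliance is still set to 5.0 even though a 1.6 JRE is installed, switching it to 6.0 under the project's Java Compiler properties and rebuilding should clear this class of errors.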
=====================Errors and Warnings=======================
Description Resource Path Location Type
The method add_partition(Partition) of type MetaStoreClient must override a superclass method MetaStoreClient.java hive/metastore/src/java/org/apache/hadoop/hive/metastore line 466 Java Problem
The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method ComparisonOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 54 Java Problem
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method NumericOpMethodResolver.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 52 Java Problem
The method getEvalMethod(List<Class<?>>) of type UDFIf.UDFIfMethodResolver must override a superclass method UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 81 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.BoolExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 205 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.ColumnExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 245 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.DefaultExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 584 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NullExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 94 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 121 Java Problem
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method TypeCheckProcFactory.java hive/ql/src/java/org/apache/hadoop/hive/ql/parse line 163 Java Problem
AbstractList is a raw type. References to generic type AbstractList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 361 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 135 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 143 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 234 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 306 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 307 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 370 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 13 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized JJTthrift_grammarState.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 14 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized OptionsProcessor.java hive/cli/src/java/org/apache/hadoop/hive/cli line 76 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized ScriptOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 397 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 249 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 250 Java Problem
ArrayList is a raw type. References to generic type ArrayList<E> should be parameterized thrift_grammar.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 2283 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ColumnInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 56 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 158 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 186 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 193 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 200 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 377 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 389 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 404 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ComplexSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 412 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 74 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ConstantTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeStructBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 50 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBase.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeBool.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 70 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeDouble.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeList.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 40 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 46 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 48 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeMap.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 49 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeSet.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 51 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypeString.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei16.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei32.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 65 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized DynamicSerDeTypei64.java hive/serde/src/java/org/apache/hadoop/hive/serde2/dynamic_type line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 112 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 113 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized FetchTask.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 115 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 148 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 149 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 151 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 173 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized HiveInputFormat.java hive/ql/src/java/org/apache/hadoop/hive/ql/io line 209 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 79 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized JuteSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/jute line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MapOperator.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 96 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 44 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 93 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 98 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 102 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized MetadataTypedSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/simple_meta line 106 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 39 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 52 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 58 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized PrimitiveTypeInfo.java hive/ql/src/java/org/apache/hadoop/hive/ql/typeinfo line 66 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized RandomDimension.java hive/ql/src/java/org/apache/hadoop/hive/ql/metadata line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 30 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 31 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 36 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 37 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 41 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 81 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 84 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 119 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 131 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ReflectionSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 139 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 68 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 73 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 78 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde line 83 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 35 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 43 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 85 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 122 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized SerDeUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 123 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 27 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 32 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized TReflectionUtils.java hive/serde/src/java/org/apache/hadoop/hive/serde line 45 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 100 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftByteStreamTypedSerDe.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 124 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized ThriftSerDeField.java hive/serde/src/java/org/apache/hadoop/hive/serde/thrift line 33 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized UDFIf.java hive/ql/src/java/org/apache/hadoop/hive/ql/udf line 67 Java Problem
Class is a raw type. References to generic type Class<T> should be parameterized Utilities.java hive/ql/src/java/org/apache/hadoop/hive/ql/exec line 493 Java Problem
=======================================================================
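
Nearly all of the entries above are the same two warnings repeated: raw uses of ArrayList and Class. These are warnings, not errors, so they do not block compilation; parameterizing the declarations silences them. A minimal sketch of the difference (nothing here is taken from the Hive sources):

```java
import java.util.ArrayList;
import java.util.List;

public class RawTypes {

    // Raw type: legal, but Eclipse flags "ArrayList is a raw type"
    // and every add() call is an unchecked operation.
    static List rawList() {
        List raw = new ArrayList();
        raw.add("hello");
        return raw;
    }

    // Parameterized equivalent: same behaviour, no warning,
    // and get() returns String without a cast.
    static List<String> typedList() {
        List<String> typed = new ArrayList<String>();
        typed.add("hello");
        return typed;
    }

    public static void main(String[] args) {
        System.out.println(rawList().get(0));
        System.out.println(typedList().get(0));
    }
}
```

The "must override a superclass method" entries, by contrast, are reported as errors and are what actually breaks the Eclipse build, as Prasad notes in his reply below.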
Do you have any suggestions?
Thanks,
Shyam
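
The "must override a superclass method" errors typically come from Eclipse compiling with Java 5 source compliance: before Java 6, @Override was not allowed on a method that implements an interface method, only on one that overrides a concrete superclass method. Raising the project's compiler compliance to 1.6 clears them without touching the sources. A minimal sketch of the pattern that trips the Java 5 check (all names here are hypothetical, not Hive's):

```java
import java.util.List;

public class OverrideDemo {

    // A stand-in for an interface like Hive's UDFMethodResolver.
    public interface MethodResolver {
        String getEvalMethod(List<Class<?>> argClasses);
    }

    public static class NumericResolver implements MethodResolver {
        // Under Java 5 source compliance this @Override is rejected with
        // "must override a superclass method", because annotating an
        // interface-method implementation is only legal from Java 6 on.
        @Override
        public String getEvalMethod(List<Class<?>> argClasses) {
            return "evaluate(" + argClasses.size() + " args)";
        }
    }

    public static void main(String[] args) {
        MethodResolver r = new NumericResolver();
        System.out.println(r.getEvalMethod(
            java.util.Arrays.<Class<?>>asList(Integer.class, Integer.class)));
    }
}
```

Deleting the annotation, as suggested below, also works, but fixing the compliance level avoids editing every affected file.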
--- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> From: Prasad Chakka <pr...@facebook.com>
> Subject: Re: Eclipse run fails !!
> To: "shyam_sarkar@yahoo.com" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> Date: Tuesday, February 3, 2009, 4:57 PM
> There are compilation errors in the Hive project, which is why
> running the tests fails. Could you send us the compilation
> errors?
> One of the errors should be on the following line. It is most
> probably an Eclipse/Java compliance issue. You can most likely
> remove the @Override annotation and get a successful
> compilation. If there are any more errors, send them to us.
>
> The method getEvalMethod(List<Class<?>>) of
> type NumericOpMethodResolver must override a superclass
> method
> at
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
>
>
> ________________________________
> From: Shyam Sarkar <sh...@yahoo.com>
> Reply-To: <sh...@yahoo.com>
> Date: Tue, 3 Feb 2009 16:51:47 -0800
> To: <hi...@hadoop.apache.org>, Prasad Chakka
> <pr...@facebook.com>
> Subject: Re: Eclipse run fails !!
>
> Dear Prasad,
>
> I followed your instructions with Hadoop version 0.17.2.1 and
> changed the JRE to version 1.6_11. When I ran the JUnit tests,
> I still got the following message:
>
> "Errors exist in required Project(s):
> hive
> Proceed with Launch ?"
>
> When I launched I got following errors ::
> =================================== It is long
> ======================
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown
> Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at
> junit.framework.TestCase.runTest(TestCase.java:154)
> at
> junit.framework.TestCase.runBare(TestCase.java:127)
> at
> junit.framework.TestResult$1.protect(TestResult.java:106)
> at
> junit.framework.TestResult.runProtected(TestResult.java:124)
> at
> junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> at
> junit.framework.TestSuite.run(TestSuite.java:203)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1598728140.txt
> Begin query: sample6.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff
> /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
> Exception: Unresolved compilation problem:
> The method
> getEvalMethod(List<Class<?>>) of type
> NumericOpMethodResolver must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method
> getEvalMethod(List<Class<?>>) of type
> NumericOpMethodResolver must override a superclass method
>
> at
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> at
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> at
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> at
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:1044)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at
> junit.framework.TestCase.runTest(TestCase.java:154)
> at
> junit.framework.TestCase.runBare(TestCase.java:127)
> at
> junit.framework.TestResult$1.protect(TestResult.java:106)
> at
> junit.framework.TestResult.runProtected(TestResult.java:124)
> at
> junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> at
> junit.framework.TestSuite.run(TestSuite.java:203)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1450160017.txt
> Begin query: sample7.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff
> /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
> Exception: Unresolved compilation problem:
> The method
> getEvalMethod(List<Class<?>>) of type
> NumericOpMethodResolver must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method
> getEvalMethod(List<Class<?>>) of type
> NumericOpMethodResolver must override a superclass method
>
> at
> org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> at
> org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> at
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> at
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:1070)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at
> junit.framework.TestCase.runTest(TestCase.java:154)
> at
> junit.framework.TestCase.runBare(TestCase.java:127)
> at
> junit.framework.TestResult$1.protect(TestResult.java:106)
> at
> junit.framework.TestResult.runProtected(TestResult.java:124)
> at
> junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> at
> junit.framework.TestSuite.run(TestSuite.java:203)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_514371634.txt
> Begin query: subq.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff
> /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out
> /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx,
> Object...) of type TypeCheckProcFactory.StrExprProcessor
> must override a superclass method
>
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx,
> Object...) of type TypeCheckProcFactory.StrExprProcessor
> must override a superclass method
>
> at
> org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at
> org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at
> org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown
> Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at
> junit.framework.TestCase.runTest(TestCase.java:154)
> at
> junit.framework.TestCase.runBare(TestCase.java:127)
> at
> junit.framework.TestResult$1.protect(TestResult.java:106)
> at
> junit.framework.TestResult.runProtected(TestResult.java:124)
> at
> junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at
> junit.framework.TestSuite.runTest(TestSuite.java:208)
> at
> junit.framework.TestSuite.run(TestSuite.java:203)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_520907971.txt
> Begin query: udf1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_656206857.txt
> Begin query: udf4.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_545867528.txt
> Begin query: udf6.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1947338661.txt
> Begin query: union.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
> at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Table doesnotexist does not exist
> Testing Filter Operator
> java.lang.Error: Unresolved compilation problem:
> The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
>
> at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
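Every "Unresolved compilation problem ... must override a superclass method" failure above points at the same root cause: under Java 5 source compliance, `@Override` is only legal on methods that override a *class* method, not on methods that implement an *interface* method; Java 6 relaxed this. When the Eclipse project's compiler compliance is set to 5.0, JDT flags these annotations as errors, and any class compiled with errors throws `java.lang.Error: Unresolved compilation problem` the moment the offending method is called, which is what the traces show. A minimal sketch of the pattern (class and method names here are hypothetical, not Hive's actual interfaces):

```java
// Hypothetical stand-in for a Hive-style processor interface.
interface NodeProcessor {
    Object process(Object nd, Object ctx, Object... args);
}

public class StrProcessorDemo implements NodeProcessor {
    // Accepted from Java 6 source compliance onward; under 1.5 compliance
    // Eclipse reports "must override a superclass method" here, because
    // process() implements an interface method rather than overriding one.
    @Override
    public Object process(Object nd, Object ctx, Object... args) {
        return "processed";
    }

    public static void main(String[] args) {
        System.out.println(new StrProcessorDemo().process(null, null));
    }
}
```

Raising the project's compiler compliance (Project > Properties > Java Compiler) to 1.6 makes these errors, and the resulting `java.lang.Error` test failures, disappear without any source change.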
> Testing FileSink Operator
> FileSink Operator ok
> Testing Script Operator
> [0] io.o=[1, 01]
> [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> [1] io.o=[2, 11]
> [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> [2] io.o=[3, 21]
> [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> [3] io.o=[4, 31]
> [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> [4] io.o=[5, 41]
> [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
> Script Operator ok
> Testing Map Operator
> io1.o.toString() = [[0, 1, 2]]
> io2.o.toString() = [[0, 1, 2]]
> answer.toString() = [[0, 1, 2]]
> io1.o.toString() = [[1, 2, 3]]
> io2.o.toString() = [[1, 2, 3]]
> answer.toString() = [[1, 2, 3]]
> io1.o.toString() = [[2, 3, 4]]
> io2.o.toString() = [[2, 3, 4]]
> answer.toString() = [[2, 3, 4]]
> io1.o.toString() = [[3, 4, 5]]
> io2.o.toString() = [[3, 4, 5]]
> answer.toString() = [[3, 4, 5]]
> io1.o.toString() = [[4, 5, 6]]
> io2.o.toString() = [[4, 5, 6]]
> answer.toString() = [[4, 5, 6]]
> Map Operator ok
> JEXL library test ok
> Evaluating 1 + 2 for 10000000 times
> Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
> Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
> Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
> java.io.FileNotFoundException: join1.q (No such file or directory)
> at java.io.FileInputStream.open(Native Method)
> at java.io.FileInputStream.<init>(Unknown Source)
> at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
> at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
> at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> ExprNodeFuncEvaluator ok
> ExprNodeColumnEvaluator ok
> testExprNodeConversionEvaluator ok
> java.lang.Error: Unresolved compilation problem:
> The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
>
> at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
> at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
> at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> bytes =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
> o class = class java.util.ArrayList
> o size = 6
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> bytes =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
> o class = class java.util.ArrayList
> o size = 6
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> bytes =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
> o class = class java.util.ArrayList
> o size = 6
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
> bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
> o class = class java.util.ArrayList
> o size = 6
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
> TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
> bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
> bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
> o class = class java.util.ArrayList
> o size = 6
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
> Beginning Test testTBinarySortableProtocol:
> Testing struct test { double hello}
> Testing struct test { i32 hello}
> Testing struct test { i64 hello}
> Testing struct test { string hello}
> Testing struct test { string hello, double another}
> Test testTBinarySortableProtocol passed!
> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> compare to =234 firstStringsecondString firstKey1secondKey2>
> o class = class java.util.ArrayList
> o size = 3
> o[0] class = class java.lang.Integer
> o[1] class = class java.util.ArrayList
> o[2] class = class java.util.HashMap
> o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
> bytes in text =234 firstStringsecondString firstKey1secondKey2>
> compare to =234 firstStringsecondString firstKey1secondKey2>
> o class = class java.util.ArrayList
> o size = 3
> o = [234, null, {firstKey=1, secondKey=2}]
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
> OK
> OK
> Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
> Loading data to table testhivedrivertable
> OK
> OK
> OK
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
> Begin query: altern1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
> Done query: altern1.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
> Begin query: bad_sample_clause.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
> Done query: bad_sample_clause.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
> Begin query: clusterbydistributeby.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
> Done query: clusterbydistributeby.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
> Begin query: clusterbysortby.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
> Done query: clusterbysortby.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
> Begin query: clustern1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at
> org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
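The repeated "must override a superclass method" failure above is the message the Eclipse compiler emits when the project's Java compiler compliance level is set to 5.0: at that level, `@Override` is not permitted on a method that implements an *interface* method (allowed only from Java 6 compliance onward). A minimal sketch of the situation, using hypothetical stand-in types rather than the real Hive `NodeProcessor` signatures:

```java
// Hypothetical minimal reproduction of the Eclipse error in this log.
// These simplified types are illustrative only, not Hive's actual classes.
interface NodeProcessor {
    Object process(Object nd, Object ctx, Object... nodeOutputs);
}

class StrExprProcessor implements NodeProcessor {
    // Under 1.5 compiler compliance, Eclipse rejects this annotation with
    // "The method process(...) must override a superclass method".
    // Under 6.0 compliance the same code compiles cleanly.
    @Override
    public Object process(Object nd, Object ctx, Object... nodeOutputs) {
        return nd;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        NodeProcessor p = new StrExprProcessor();
        System.out.println(p.process("ok", null)); // prints "ok"
    }
}
```

If this is indeed the cause, raising the compiler compliance level to 6.0 (Project > Properties > Java Compiler) should clear the "Unresolved compilation problem" errors and the test failures they trigger.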
> Begin query: clustern2.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
> Begin query: clustern3.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
> Begin query: clustern4.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
> Begin query: describe_xpath1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
> Done query: describe_xpath1.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
> Begin query: describe_xpath2.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
> Done query: describe_xpath2.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
> Begin query: describe_xpath3.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
> Done query: describe_xpath3.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
> Begin query: describe_xpath4.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
> Done query: describe_xpath4.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
> Begin query: fileformat_bad_class.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
> Done query: fileformat_bad_class.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
> Begin query: fileformat_void_input.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
> Begin query: fileformat_void_output.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
> Done query: fileformat_void_output.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
> Begin query: input1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
> Done query: input1.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
> Begin query: input2.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
> Begin query: input_testxpath4.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
> Begin query: invalid_create_tbl1.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
> Done query: invalid_create_tbl1.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
> Begin query: invalid_create_tbl2.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\)
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out
> /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
> Done query: invalid_create_tbl2.q
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
> Begin query: invalid_select_expression.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\)
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out
> /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
> Done query: invalid_select_expression.q
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
> Begin query: invalid_tbl_name.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\)
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out
> /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
> Done query: invalid_tbl_name.q
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
> Begin query: joinneg.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\)
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out
> /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
> Done query: joinneg.q
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
> Begin query: load_wrong_fileformat.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\)
> /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out
> /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
> Done query: load_wrong_fileformat.q
> Hive history
> file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
> Begin query: notable_alias3.q
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08,
> hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09,
> hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
> Begin query: notable_alias4.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
> Begin query: strict_pruning.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> Exception: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
>
> java.lang.Error: Unresolved compilation problem:
> The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
>
> at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
> at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
> at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
> at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
> at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
> at java.lang.reflect.Method.invoke(Unknown Source)
> at junit.framework.TestCase.runTest(TestCase.java:154)
> at junit.framework.TestCase.runBare(TestCase.java:127)
> at junit.framework.TestResult$1.protect(TestResult.java:106)
> at junit.framework.TestResult.runProtected(TestResult.java:124)
> at junit.framework.TestResult.run(TestResult.java:109)
> at junit.framework.TestCase.run(TestCase.java:118)
> at junit.framework.TestSuite.runTest(TestSuite.java:208)
> at junit.framework.TestSuite.run(TestSuite.java:203)
> at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
> at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
> Begin query: subq_insert.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
> Done query: subq_insert.q
> Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
> Begin query: union.q
> Loading data to table srcpart partition {ds=2008-04-08, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-08, hr=12}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=11}
> OK
> Loading data to table srcpart partition {ds=2008-04-09, hr=12}
> OK
> Loading data to table srcbucket
> OK
> Loading data to table srcbucket
> OK
> Loading data to table src
> OK
> diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
> Done query: union.q
> =====================================================================
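A side note on the repeated "must override a superclass method" failures above: this is the classic Java 5 vs Java 6 @Override difference. The sketch below uses simplified, hypothetical stand-ins for Hive's NodeProcessor types (not the real signatures). Under compiler compliance 1.5, @Override is rejected on a method that merely implements an interface method; compliance 1.6 accepts it. Eclipse still writes a class file for the failed compile, so calling the method at run time throws java.lang.Error: Unresolved compilation problem, exactly as in the traces.

```java
// Simplified, hypothetical stand-in for Hive's processor interface.
interface NodeProcessor {
    Object process(Object nd, Object ctx, Object... nodeOutputs);
}

// Implements (does not override) the interface method. With Eclipse
// compliance set to 1.5 the @Override below is flagged as
// "must override a superclass method"; with 1.6 it compiles cleanly.
class StrExprProcessor implements NodeProcessor {
    @Override  // compile error at compliance 1.5, legal at 1.6
    public Object process(Object nd, Object ctx, Object... nodeOutputs) {
        return "processed";
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        NodeProcessor p = new StrExprProcessor();
        System.out.println(p.process(null, null));
    }
}
```

Raising the Eclipse project's compiler compliance to 1.6 (Prasad's step 7 below) makes the annotation legal, so these particular failures disappear.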
>
> Thanks,
> Shyam
>
>
>
>
> --- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
>
> > From: Prasad Chakka <pr...@facebook.com>
> > Subject: Re: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 2:51 PM
> > I think there are multiple issues. Please do the following:
> >
> > 1. Run 'ant clean' in the hive directory.
> > 2. Delete the project in Eclipse.
> > 3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start the HDFS cluster, since the unit tests work against the local file system.
> > 4. Check that the Java version is 1.6.
> > 5. Follow the steps in the hive eclipse setup wiki with -Dhadoop.version=0.17.2.1.
> > 6. Open Eclipse and import the project.
> > 7. Open the project preferences and make sure the project is using Java 6. If it is not, change it to use Java 6 (let me know if you need help here). If you change it, make sure you rebuild the project by doing a clean.
> > 8. Make sure there are no compilation problems in the hive project (check the 'Problems' tab in the bottom panel of Eclipse).
> > 9. Run the JUnit test case. It should run without any warning dialogs.
> >
> > Let me know which of these steps fail and what the error is. You need not change any files to run a JUnit test case. Once you are at this point, we can help you in setting up a command shell that talks to DFS.
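Steps 4 and 7 in the checklist above both come down to which JVM level the JUnit launch actually uses. A small self-contained probe (hypothetical class name; the system properties are standard) can confirm it. The "Bad version number in .class file" error quoted later in this thread is the runtime symptom of the mismatch: classes compiled for a higher class-file version than the running JVM supports (50.0 is Java 6, 49.0 is Java 5).

```java
// Hypothetical helper: print the JVM level the tests run under.
public class JvmCheck {
    public static void main(String[] args) {
        // e.g. "1.6.0_07" on the Java 6 runtime the Hive build expects
        String javaVersion = System.getProperty("java.version");
        // class-file major version: 50.0 = Java 6, 49.0 = Java 5
        double classVersion =
            Double.parseDouble(System.getProperty("java.class.version"));
        System.out.println("java.version = " + javaVersion);
        System.out.println("java.class.version = " + classVersion);
    }
}
```

Run it with the same JRE selected in the Eclipse launch configuration; if java.class.version reports 49.0, the launch is still on Java 5.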
> >
> > Prasad
> >
> > ________________________________
> From: Ashish Thusoo <at...@facebook.com>
> Reply-To: <hi...@hadoop.apache.org>
> Date: Tue, 3 Feb 2009 14:41:12 -0800
> To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
> Subject: RE: Eclipse run fails !!
> >
> Actually, for running hive through eclipse you don't need to download and start hadoop. Hive tests automatically create a local instance of hdfs and map/reduce and are able to run against it.
>
> The errors that you are getting seem to indicate some jpox plugins missing in eclipse. Prasad is an expert in that area and can perhaps comment on that...
> >
> > Ashish
> >
> > -----Original Message-----
> > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > Sent: Tuesday, February 03, 2009 2:30 PM
> > To: hive-dev@hadoop.apache.org; Ashish Thusoo
> > Subject: RE: Eclipse run fails !!
> >
> Dear Ashish,
>
> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error ::
> ==============================================================
> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
> starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> localhost: Exception in thread "main" java.lang.NullPointerException
> localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> localhost: at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> localhost: at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> ssarkar@localhost's password:
> localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> [ssarkar@ayush2 hadoop-0.17.0]$
> ===================================================================
> >
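One common cause of the SecondaryNameNode NullPointerException in NetUtils.createSocketAddr above is that the address it derives from the configuration is empty or malformed; with Hadoop 0.17 that usually points at fs.default.name (or the secondary namenode's dfs.http.address) in conf/hadoop-site.xml. A hypothetical minimal entry (host and port are placeholders, not taken from this thread) would look like:

```xml
<configuration>
  <property>
    <!-- Placeholder value; must be a parseable host:port URI or the
         address parsing in NetUtils.createSocketAddr can fail. -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

As Prasad notes elsewhere in the thread, though, the Hive unit tests run against the local file system, so starting HDFS is not needed for the Eclipse run.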
> Next I loaded the hive project into eclipse following the steps in the hive wiki.
> I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error ::
>
> "Errors exist in required Project(s):
>
> hive
>
> Proceed with Launch ?"
>
> When I launched it I got the following errors ::
>
> =================================================================
> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
> 09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
> > 09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
> > 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
> > 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
> > 09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
> > 09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
> > 09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
> > 09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
> > java.lang.UnsupportedClassVersionError: Bad version number in .class file
> > at java.lang.ClassLoader.defineClass1(Native Method)
> > at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> > at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> > at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> > at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> > at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> > at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> > at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
> > at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
> > at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
> > at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
> > at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> > at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
> > at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> > at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
> > at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
> > at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
> > at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
> > at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
> > at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> > at junit.framework.TestSuite.createTest(TestSuite.java:131)
> > at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> > at junit.framework.TestSuite.<init>(TestSuite.java:75)
> > at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> > at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > java.lang.ExceptionInInitializerError
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> > at junit.framework.TestSuite.createTest(TestSuite.java:131)
> > at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> > at junit.framework.TestSuite.<init>(TestSuite.java:75)
> > at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> > at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> > at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> > Caused by: java.lang.RuntimeException: Encountered throwable
> > at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
> > ... 13 more
> >
> > ======================================================================
> >
> > regards,
> > Shyam
> >
> > --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> >
> > > From: Ashish Thusoo <at...@facebook.com>
> > > Subject: RE: Eclipse run fails !!
> > > To: "Shyam Sarkar" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> > > Date: Tuesday, February 3, 2009, 1:46 PM
> > >
> > > Hi Shyam,
> > >
> > > I can certainly say that 0.17.0 should work with Eclipse. I have been doing that for a while.
> > >
> > > Maybe we can concentrate on fixing why you are not able to create a table in HDFS. I am not sure why you could not create the /user/hive/warehouse directory in 0.17. Are you saying that
> > >
> > > hadoop dfs -mkdir /user/facebook/hive
> > >
> > > does not work for you? Can you send out the output when you run this command?
> > >
> > > Ashish
> > >
> > > PS: Using -Dhadoop.version="0.17.0" for all the commands that are given in the wiki should make things work in Eclipse.
> > >
> > > -----Original Message-----
> > > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > > Sent: Tuesday, February 03, 2009 12:00 PM
> > > To: hive-dev@hadoop.apache.org; Ashish Thusoo
> > > Subject: RE: Eclipse run fails !!
> > >
> > > Dear Ashish,
> > >
> > > For the last few days I tried Eclipse 3.4.1 with the 0.17.2.1 version and got the same errors with Run->Run. Then I looked into the bin/hive command and found that it could not create a table in HDFS. The reason was that I could not create the /user/hive/warehouse directory inside HDFS; it was using the Linux FS. This is why I switched to 0.19.0, where directories in HDFS can be created.
> > >
> > > Could you please tell me which exact version of Hadoop will work fine with Eclipse runs? I want to get rid of the errors in the project itself (before any run).
> > >
> > > Regards,
> > > Shyam
> > >
> > > --- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> > >
> > > > From: Ashish Thusoo <at...@facebook.com>
> > > > Subject: RE: Eclipse run fails !!
> > > > To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> > > > Date: Tuesday, February 3, 2009, 11:38 AM
> > > >
> > > > Hi Shyam,
> > > >
> > > > We have not really tried the Eclipse stuff for 0.19.0. Is it possible for you to use 0.17.0 for now, while we figure this out...
> > > >
> > > > Ashish
> > > >
> > > > -----Original Message-----
> > > > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > > > Sent: Tuesday, February 03, 2009 11:26 AM
> > > > To: hive-dev@hadoop.apache.org
> > > > Subject: Eclipse run fails !!
> > > >
> > > > Hello,
> > > >
> > > > I have the Hive project loaded inside Eclipse 3.4.1, and Hadoop 0.19.0 is running in the background. I could create tables from the bin/hive command. But when I try Run->Run inside Eclipse, it says:
> > > >
> > > > "Errors exist with required project(s):
> > > >
> > > > hive
> > > >
> > > > Proceed with launch ?"
> > > >
> > > > and then it gives many errors.
> > > >
> > > > Can someone please tell me why there are errors in the Hive project? I followed all the steps correctly from the Hive wiki.
> > > >
> > > > Regards,
> > > > shyam_sarkar@yahoo.com
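The java.lang.UnsupportedClassVersionError quoted above ("Bad version number in .class file") means the JVM loading a class is older than the JDK that compiled it; for example, a class file built by Java 6 (major version 50) cannot be loaded by a Java 5 runtime (which understands up to major 49). One way to see which release produced a given class file is to read the version fields from its header. This is an illustrative sketch only (the class name and helper are mine, not from this thread):

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class ClassVersionCheck {
    // Returns the class-file major version: 49 = Java 5, 50 = Java 6, and so on.
    static int majorVersion(String classFilePath) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(classFilePath))) {
            int magic = in.readInt();        // every class file starts with 0xCAFEBABE
            if (magic != 0xCAFEBABE) {
                throw new IOException("not a class file: " + classFilePath);
            }
            in.readUnsignedShort();          // minor version (ignored here)
            return in.readUnsignedShort();   // major version
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(majorVersion(args[0]));
    }
}
```

Running this against one of the jars' classes (after extracting it) shows whether it was built with a newer JDK than the JRE Eclipse is launching the tests with; matching the Eclipse launch JRE to the JDK that built the jars avoids the error.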
Re: Eclipse run fails !!
Posted by Prasad Chakka <pr...@facebook.com>.
There are compilation errors in the Hive project; that is why running the tests causes issues. Could you send us the compilation errors?
One of the errors should be on the following line. It is most probably an Eclipse and Java compiler issue. You can most probably remove the @Override annotation and get a successful compilation. If there are any more errors, send them to us.
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
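For context, this class of error typically comes from compiler source-level differences: under Java 5 compliance rules, @Override is rejected on a method that implements an interface method, while Java 6 and later accept it. A minimal sketch of the situation (the interface and class names here are hypothetical, chosen only to mirror the NumericOpMethodResolver case):

```java
import java.util.List;

public class OverrideDemo {
    // Hypothetical stand-in for a method-resolver interface.
    interface MethodResolver {
        String getEvalMethod(List<Class<?>> argClasses);
    }

    static class NumericResolver implements MethodResolver {
        // Under -source 1.5 this annotation is a compile error
        // ("must override a superclass method"); under 1.6+ it is legal,
        // because @Override is allowed on interface implementations.
        @Override
        public String getEvalMethod(List<Class<?>> argClasses) {
            return "evaluate";
        }
    }

    public static void main(String[] args) {
        System.out.println(new NumericResolver().getEvalMethod(null));
    }
}
```

So either removing the annotation, as suggested above, or raising the Eclipse compiler compliance level to 1.6 (Project > Properties > Java Compiler) should make the class compile.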
________________________________
From: Shyam Sarkar <sh...@yahoo.com>
Reply-To: <sh...@yahoo.com>
Date: Tue, 3 Feb 2009 16:51:47 -0800
To: <hi...@hadoop.apache.org>, Prasad Chakka <pr...@facebook.com>
Subject: Re: Eclipse run fails !!
Dear Prasad,
I followed your instructions with the 0.17.2.1 Hadoop version and changed the JRE to version 1.6_11. When I ran the JUnit tests, I still got the following message:
"Errors exist in required Project(s):
hive
Proceed with Launch ?"
When I launched, I got the following errors:
=================================== It is long ======================
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1598728140.txt
Begin query: sample6.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
Exception: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:1044)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1450160017.txt
Begin query: sample7.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
Exception: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:1070)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_514371634.txt
Begin query: subq.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_520907971.txt
Begin query: udf1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_656206857.txt
Begin query: udf4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_545867528.txt
Begin query: udf6.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
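[Editor's note] Nearly all of the "Unresolved compilation problem" failures below repeat the same message: `@Override` "must override a superclass method". This is characteristic of Eclipse compiling with a compiler compliance level below 6.0: under Java 5 rules, `@Override` is only legal on a method that overrides a method declared in a superclass, so applying it to a method that merely implements an interface method (as `StrExprProcessor.process` does for `NodeProcessor`, and the `getEvalMethod` resolvers do for `UDFMethodResolver`) is rejected at compile time. A minimal reproduction, with hypothetical names standing in for the Hive types:

```java
// Sketch of the failing pattern. "Processor" stands in for Hive's
// NodeProcessor interface; it is not a real Hive class.
interface Processor {
    Object process(Object nd, Object ctx, Object... rest);
}

public class OverrideDemo implements Processor {
    // A 5.0-compliance compiler rejects this annotation with
    // "must override a superclass method"; 6.0+ accepts it.
    @Override
    public Object process(Object nd, Object ctx, Object... rest) {
        return nd;
    }

    public static void main(String[] args) {
        System.out.println(new OverrideDemo().process("ok", null));
    }
}
```

If this diagnosis is right, raising Window > Preferences > Java > Compiler > "Compiler compliance level" (or the project-specific setting) to 6.0 and rebuilding should clear these test failures without any source change.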
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1947338661.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Table doesnotexist does not exist
Testing Filter Operator
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Testing FileSink Operator
FileSink Operator ok
Testing Script Operator
[0] io.o=[1, 01]
[0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[1] io.o=[2, 11]
[1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[2] io.o=[3, 21]
[2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[3] io.o=[4, 31]
[3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[4] io.o=[5, 41]
[4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
Script Operator ok
Testing Map Operator
io1.o.toString() = [[0, 1, 2]]
io2.o.toString() = [[0, 1, 2]]
answer.toString() = [[0, 1, 2]]
io1.o.toString() = [[1, 2, 3]]
io2.o.toString() = [[1, 2, 3]]
answer.toString() = [[1, 2, 3]]
io1.o.toString() = [[2, 3, 4]]
io2.o.toString() = [[2, 3, 4]]
answer.toString() = [[2, 3, 4]]
io1.o.toString() = [[3, 4, 5]]
io2.o.toString() = [[3, 4, 5]]
answer.toString() = [[3, 4, 5]]
io1.o.toString() = [[4, 5, 6]]
io2.o.toString() = [[4, 5, 6]]
answer.toString() = [[4, 5, 6]]
Map Operator ok
JEXL library test ok
Evaluating 1 + 2 for 10000000 times
Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
java.io.FileNotFoundException: join1.q (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(Unknown Source)
at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
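[Editor's note] The `join1.q` FileNotFoundException above looks unrelated to the compilation errors: `QTestUtil.addFile` is opening the query file by a relative path, so it resolves against the working directory of the Eclipse JUnit launch, which likely differs from the directory the ant test harness runs in. This is an assumption about the run configuration, not something the trace proves; a quick way to check which directory the launch actually uses:

```java
// Prints the working directory the JVM was launched with; relative
// paths like "join1.q" resolve against this directory.
public class Cwd {
    public static void main(String[] args) {
        System.out.println(System.getProperty("user.dir"));
    }
}
```

If the printed directory is not the one containing the .q files, adjusting the "Working directory" field on the run configuration's Arguments tab should fix this particular failure.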
ExprNodeFuncEvaluator ok
ExprNodeColumnEvaluator ok
testExprNodeConversionEvaluator ok
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Beginning Test testTBinarySortableProtocol:
Testing struct test { double hello}
Testing struct test { i32 hello}
Testing struct test { i64 hello}
Testing struct test { string hello}
Testing struct test { string hello, double another}
Test testTBinarySortableProtocol passed!
bytes in text =234 firstStringsecondString firstKey1secondKey2>
compare to =234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
bytes in text =234 firstStringsecondString firstKey1secondKey2>
compare to =234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o = [234, null, {firstKey=1, secondKey=2}]
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
OK
OK
Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
Loading data to table testhivedrivertable
OK
OK
OK
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
Begin query: altern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
Done query: altern1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
Begin query: bad_sample_clause.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
Done query: bad_sample_clause.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
Begin query: clusterbydistributeby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
Done query: clusterbydistributeby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
Begin query: clusterbysortby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
Done query: clusterbysortby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
Begin query: clustern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
Begin query: clustern2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
Begin query: clustern3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
Begin query: clustern4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
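The "must override a superclass method" errors repeated throughout this log are typically an Eclipse compiler-compliance issue rather than a Hive bug: under Java 1.5 compliance, Eclipse rejects an @Override annotation on a method that implements an interface method (that use of @Override only became legal in Java 6). A minimal sketch of the pattern, with hypothetical names standing in for Hive's NodeProcessor classes:

```java
// Hypothetical reproduction of the error above (names are illustrative,
// not Hive's actual classes).
interface NodeProcessor {
    Object process(Object node, Object... args);
}

class StrExprProcessor implements NodeProcessor {
    // With project compiler compliance set to 1.5, Eclipse flags this
    // annotation with "The method process(...) must override a superclass
    // method", because the annotated method implements an interface
    // method rather than overriding a class method. Under 1.6+ it is legal.
    @Override
    public Object process(Object node, Object... args) {
        return node;
    }
}

public class OverrideDemo {
    public static void main(String[] args) {
        NodeProcessor p = new StrExprProcessor();
        System.out.println(p.process("ok"));
    }
}
```

If this is the cause, raising the project's compiler compliance level to 1.6 (Project > Properties > Java Compiler in Eclipse) should clear the "Unresolved compilation problem" errors without any source changes.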
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
Begin query: describe_xpath1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
Done query: describe_xpath1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
Begin query: describe_xpath2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
Done query: describe_xpath2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
Begin query: describe_xpath3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
Done query: describe_xpath3.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
Begin query: describe_xpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
Done query: describe_xpath4.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
Begin query: fileformat_bad_class.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
Done query: fileformat_bad_class.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
Begin query: fileformat_void_input.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
Begin query: fileformat_void_output.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
Done query: fileformat_void_output.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
Begin query: input1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
Done query: input1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
Begin query: input2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
Begin query: input_testxpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
Begin query: invalid_create_tbl1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
Done query: invalid_create_tbl1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
Begin query: invalid_create_tbl2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
Done query: invalid_create_tbl2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
Begin query: invalid_select_expression.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
Done query: invalid_select_expression.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
Begin query: invalid_tbl_name.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
Done query: invalid_tbl_name.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
Begin query: joinneg.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
Done query: joinneg.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
Begin query: load_wrong_fileformat.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
Done query: load_wrong_fileformat.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
Begin query: notable_alias3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
Begin query: notable_alias4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
Begin query: strict_pruning.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
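The "must override a superclass method" failures above are the classic Java 5 vs Java 6 @Override rule: under 1.5 compiler compliance, @Override may only annotate a method that overrides a concrete superclass method, so annotating an implementation of an interface method is a compile error, while 1.6 compliance allows it. A minimal sketch (hypothetical names, not Hive's actual classes) that reproduces the error:

```java
// A Java 5 compiler (or an Eclipse project whose compiler compliance is
// set to 1.5) rejects the annotation below with "The method
// process(Object) of type OverrideDemo must override a superclass
// method"; a Java 6 compiler accepts it, because Java 6 also permits
// @Override on implementations of interface methods.
interface NodeProcessor {
    Object process(Object nd);
}

public class OverrideDemo implements NodeProcessor {
    @Override   // legal under 1.6 compliance, a compile error under 1.5
    public Object process(Object nd) {
        return nd;
    }

    public static void main(String[] args) {
        System.out.println(new OverrideDemo().process("ok"));
    }
}
```

This is why the project must be built at Java 6 compliance end to end: Eclipse compiles the classes with the error baked in, and they then blow up at runtime as java.lang.Error: Unresolved compilation problem.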
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
Begin query: subq_insert.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
Done query: subq_insert.q
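For context, each diff line in this log compares the test's freshly generated output against the golden file checked into the source tree; the -I option tells diff to ignore hunks whose changed lines all match the given basic regular expression, so file: URIs and per-run /tmp paths do not cause spurious failures. A small throwaway illustration of the same flag (temporary files, nothing from the Hive tree):

```shell
# Two files that differ only on a line matching the ignore pattern.
printf 'a\nfile:/run1/path\n' > got.txt
printf 'a\nfile:/run2/path\n' > exp.txt
# -I RE suppresses hunks whose inserted/deleted lines all match RE;
# the basic-regex alternation matches "file:" URIs or /tmp paths.
diff -I '\(file:\)\|\(/tmp/.*\)' got.txt exp.txt && echo "outputs match"
```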
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
Done query: union.q
=====================================================================
Thanks,
Shyam
--- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> From: Prasad Chakka <pr...@facebook.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 2:51 PM
> I think there are multiple issues. Please do the following
>
>
> 1. 'ant clean' in hive directory
> 2. delete project in eclipse
> 3. Don't change any config values in hive-site.xml
> (revert your changes to fs.default.name etc) and don't
> start the HDFS cluster, since the unit tests work on the
> local file system.
> 4. check that the java version is 1.6
> 5. Follow the steps in the hive eclipse setup wiki with
> -Dhadoop.version=0.17.2.1
> 6. Open Eclipse and import the project
> 7. Open project preferences and make sure that it is
> using java 6. If it is not, then change it to use java 6 (let
> me know if you need help here). If you change it, then make
> sure that you rebuild the project by doing a clean build
> 8. Make sure that there are no compilation problems for
> the hive project (check 'problems' tab in the bottom
> panel of Eclipse)
> 9. Run the Junit test case. It should run without any
> warning dialogs
>
> Let me know which of these steps fail and what the
> error is. You need not change any files to run a junit testcase.
> Once you are at this point, we can help you in setting up
> command shell that talks to DFS.
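The command-line portion of the steps above (1, 4, and 5) can be sketched roughly as follows; the exact ant targets and flags come from the Hive eclipse setup wiki mentioned in step 5, so treat this as an outline rather than the authoritative recipe:

```shell
cd ~/hive               # the hive checkout (path is illustrative)
ant clean               # step 1: remove stale build output
java -version           # step 4: should report 1.6.x
# step 5: rebuild with the hadoop version the thread recommends;
# the precise target(s) are whatever the eclipse setup wiki prescribes
ant -Dhadoop.version=0.17.2.1 package
```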
>
> Prasad
>
> ________________________________
> From: Ashish Thusoo <at...@facebook.com>
> Reply-To: <hi...@hadoop.apache.org>
> Date: Tue, 3 Feb 2009 14:41:12 -0800
> To: <sh...@yahoo.com>,
> <hi...@hadoop.apache.org>
> Subject: RE: Eclipse run fails !!
>
> Actually, for running hive through eclipse you don't
> need to download and start hadoop. The Hive tests automatically
> create a local instance of hdfs and map/reduce and are able
> to run against it.
>
> The errors that you are getting seem to indicate some jpox
> plugins missing in eclipse. Prasad is an expert in that area
> and can perhaps comment on that...
>
> Ashish
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 2:30 PM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Subject: RE: Eclipse run fails !!
>
> Dear Ashish,
>
> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh
> script. I got one error:
> ==============================================================
> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
> starting
> namenode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting datanode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting secondarynamenode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> localhost: Exception in thread "main"
> java.lang.NullPointerException
> localhost: at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> localhost: at
> org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> localhost: at
> org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> starting jobtracker, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> ssarkar@localhost's password:
> localhost: starting tasktracker, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> [ssarkar@ayush2 hadoop-0.17.0]$
> ===================================================================
>
> Next I loaded the hive project into eclipse following the steps
> in the hive wiki.
> I tried Run->Run Configurations->JUnit and selected
> TestTruncate to run but got the following error ::
>
> "Errors exist in required project(s):
>
> hive
>
> Proceed with Launch ?"
>
> When I launched, I got the following errors:
>
> =================================================================
> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening
> raw store with implemenation
> class:org.apache.hadoop.hive.metastore.ObjectStore
> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore,
> initialize called
> 09/02/03 14:01:33 INFO metastore.ObjectStore: found
> resource jpox.properties at
> file:/home/ssarkar/hive/conf/jpox.properties
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.views" not registered, but plugin
> "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.perspectiveExtensions" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.preferencePages" not registered,
> but plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.keywords" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.core.launchConfigurationTypes"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.core.launchConfigurationComparators"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchConfigurationTypeImages"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchConfigurationTabGroups"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.newWizards" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.popupMenus" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.actionSets" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.actionSetPartAssociations" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchShortcuts" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.core.classpathVariableInitializer"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.ui.quickFixProcessors" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.ui.classpathFixProcessors" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.ide.markerResolution" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.core.expressions.propertyTesters" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ltk.core.refactoring.renameParticipants"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.commands" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.bindings" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.core.runtime.preferences" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.core.classpathContainerInitializer"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.ui.classpathContainerPage" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ui.ide" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ui.views" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.jface.text" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ui.workbench.texteditor" but it
> cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ui.editors" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.core.expressions" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.core.resources" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.debug.core" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.debug.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.jdt.core" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.jdt.ui" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.core.runtime" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.jdt.launching" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.jdt.debug.ui" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.compare" but it cannot be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ltk.core.refactoring" but it cannot
> be resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.core.variables" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit" requires
> "org.eclipse.ltk.ui.refactoring" but it cannot be
> resolved.
> 09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle
> "org.eclipse.jdt.junit.runtime" requires
> "org.junit" but it cannot be resolved.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle
> "org.jpox" has an optional dependency to
> "org.eclipse.equinox.registry" but it cannot be
> resolved
> 09/02/03 14:01:33 WARN JPOX.Plugin: Bundle
> "org.jpox" has an optional dependency to
> "org.eclipse.core.runtime" but it cannot be
> resolved
> 09/02/03 14:01:33 INFO JPOX.Persistence: =================
> Persistence Configuration ===============
> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence
> Factory - Vendor: "JPOX" Version:
> "1.2.2"
> 09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence
> Factory initialised for datastore
> URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true"
> driver="org.apache.derby.jdbc.EmbeddedDriver"
> userName="APP"
> 09/02/03 14:01:33 INFO JPOX.Persistence:
> ===========================================================
> 09/02/03 14:01:35 INFO Datastore.Schema: Initialising
> Catalog "", Schema "APP" using
> "SchemaTable" auto-start option
> 09/02/03 14:01:36 INFO Datastore.Schema: Catalog
> "", Schema "APP" initialised - managing
> 0 classes
> 09/02/03 14:01:36 INFO JPOX.JDO: >> Found
> StoreManager org.jpox.store.rdbms.RDBMSManager
> java.lang.UnsupportedClassVersionError: Bad version number
> in .class file
> at java.lang.ClassLoader.defineClass1(Native
> Method)
> at
> java.lang.ClassLoader.defineClass(ClassLoader.java:620)
> at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
> at
> java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
> at
> java.net.URLClassLoader.access$100(URLClassLoader.java:56)
> at
> java.net.URLClassLoader$1.run(URLClassLoader.java:195)
> at
> java.security.AccessController.doPrivileged(Native Method)
> at
> java.net.URLClassLoader.findClass(URLClassLoader.java:188)
> at
> java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> at
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
> at
> java.lang.ClassLoader.loadClass(ClassLoader.java:251)
> at
> java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
> at
> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
> at
> org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
> at
> org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
> at
> org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
> at
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> at
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> at
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> at
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> java.lang.ExceptionInInitializerError
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
> at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at
> java.lang.reflect.Constructor.newInstance(Constructor.java:494)
> at
> junit.framework.TestSuite.createTest(TestSuite.java:131)
> at
> junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
> at
> junit.framework.TestSuite.<init>(TestSuite.java:75)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
> at
> org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
> at
> org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
> Caused by: java.lang.RuntimeException: Encountered
> throwable
> at
> org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
> ... 13 more
> ======================================================================
>
> regards,
> Shyam
>
>
>
>
>
> --- On Tue, 2/3/09, Ashish Thusoo
> <at...@facebook.com> wrote:
>
> > From: Ashish Thusoo <at...@facebook.com>
> > Subject: RE: Eclipse run fails !!
> > To: "Shyam Sarkar"
> <sh...@yahoo.com>,
> > "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>
> > Date: Tuesday, February 3, 2009, 1:46 PM
> >
> > Hi Shyam,
> >
> > I can certainly say that 0.17.0 should work with
> eclipse. I have been
> > doing that for a while.
> >
> > Maybe we can concentrate on fixing why you are not
> able to create a
> > table in hdfs. I am not sure why you could not create
> the
> > /user/hive/warehouse directory in 0.17. Are you saying
> that
> >
> > hadoop dfs -mkdir /user/facebook/hive
> >
> > does not work for you? Can you send out the output
> when you run this
> > command.
> >
> > Ashish
> >
> > PS: using -Dhadoop.version="0.17.0" for all
> > the commands that are
> > given in the wiki should make things work in eclipse.
> >
> > -----Original Message-----
> > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > Sent: Tuesday, February 03, 2009 12:00 PM
> > To: hive-dev@hadoop.apache.org; Ashish Thusoo
> > Subject: RE: Eclipse run fails !!
> >
> > Dear Ashish,
> >
> > For the last few days I tried eclipse 3.4.1 with
> > 0.17.2.1 version and
> > got the same errors with run->run. Then I looked
> > into the bin/hive command
> > and found that it could not create a table in HDFS. The
> > reason was that
> > I could not create the /user/hive/warehouse directory
> > inside HDFS. It was
> > using the Linux FS.
> > This is why I switched to 0.19.0, where directories in
> > HDFS can be
> > created.
> >
> > Could you please tell me which exact version of hadoop
> will work fine
> > with eclipse runs ? I want to get rid of errors in
> project itself
> > (before any run).
> >
> > Regards,
> > Shyam
> >
> > --- On Tue, 2/3/09, Ashish Thusoo
> > <at...@facebook.com> wrote:
> >
> > > From: Ashish Thusoo <at...@facebook.com>
> > > Subject: RE: Eclipse run fails !!
> > > To: "hive-dev@hadoop.apache.org"
> > <hi...@hadoop.apache.org>,
> > > "shyam_sarkar@yahoo.com"
> > <sh...@yahoo.com>
> > > Date: Tuesday, February 3, 2009, 11:38 AM
> > >
> > > Hi Shyam,
> > >
> > > We have not really tried the eclipse stuff for
> 0.19.0.
> > Is it possible
> > > for you to use 0.17.0 for now, while we figure
> this
> > out...
> > >
> > > Ashish
> > >
> > > -----Original Message-----
> > > From: Shyam Sarkar
> [mailto:shyam_sarkar@yahoo.com]
> > > Sent: Tuesday, February 03, 2009 11:26 AM
> > > To: hive-dev@hadoop.apache.org
> > > Subject: Eclipse run fails !!
> > >
> > > Hello,
> > >
> > > I have hive project loaded inside eclipse 3.4.1
> and
> > hadoop 0.19.0 is
> > > running in the background. I could create tables
> from
> > bin/hive
> > > command.
> > > But when I try to run->run inside eclipse it
> says::
> > >
> > > "Errors exist with required project(s):
> > >
> > > hive
> > >
> > > Proceed with launch ?"
> > >
> > > and then it gives many errors.
> > >
> > > Can someone please tell me why there are errors
> in
> > project hive ? I
> > > followed all steps correctly from hive wiki.
> > >
> > > Regards,
> > > shyam_sarkar@yahoo.com
Re: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Prasad,
I followed your instructions with hadoop version 0.17.2.1 and changed the JRE to version 1.6_11. When I ran the JUnit test, I still got the following message:
"Errors exist in required Project(s):
hive
Proceed with Launch ?"
When I launched, I got the following errors:
=================================== It is long ======================
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1598728140.txt
Begin query: sample6.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/sample6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample6.q.out
Exception: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:1044)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1450160017.txt
Begin query: sample7.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/sample7.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/sample7.q.out
Exception: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSamplePredicate(SemanticAnalyzer.java:2872)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genTablePlan(SemanticAnalyzer.java:2985)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3027)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:1070)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_514371634.txt
Begin query: subq.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/subq.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/subq.q.out
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1096)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_520907971.txt
Begin query: udf1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf1.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf1.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:1122)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_656206857.txt
Begin query: udf4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf4.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf4.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:1148)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_545867528.txt
Begin query: udf6.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/udf6.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/udf6.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf6(TestParse.java:1174)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1947338661.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff /home/ssarkar/hive/build/ql/test/logs/positive/union.q.out /home/ssarkar/hive/ql/src/test/results/compiler/parse/union.q.out
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3000)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3003)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3021)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.QTestUtil.analyzeAST(QTestUtil.java:691)
at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:1200)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Table doesnotexist does not exist
Testing Filter Operator
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type ComparisonOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.ComparisonOpMethodResolver.getEvalMethod(ComparisonOpMethodResolver.java:54)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.exec.TestOperators.testBaseFilterOperator(TestOperators.java:79)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Testing FileSink Operator
FileSink Operator ok
Testing Script Operator
[0] io.o=[1, 01]
[0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[1] io.o=[2, 11]
[1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[2] io.o=[3, 21]
[2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[3] io.o=[4, 31]
[3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
[4] io.o=[5, 41]
[4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@acb988
Script Operator ok
Testing Map Operator
io1.o.toString() = [[0, 1, 2]]
io2.o.toString() = [[0, 1, 2]]
answer.toString() = [[0, 1, 2]]
io1.o.toString() = [[1, 2, 3]]
io2.o.toString() = [[1, 2, 3]]
answer.toString() = [[1, 2, 3]]
io1.o.toString() = [[2, 3, 4]]
io2.o.toString() = [[2, 3, 4]]
answer.toString() = [[2, 3, 4]]
io1.o.toString() = [[3, 4, 5]]
io2.o.toString() = [[3, 4, 5]]
answer.toString() = [[3, 4, 5]]
io1.o.toString() = [[4, 5, 6]]
io2.o.toString() = [[4, 5, 6]]
answer.toString() = [[4, 5, 6]]
Map Operator ok
JEXL library test ok
Evaluating 1 + 2 for 10000000 times
Evaluation finished: 0.562 seconds, 0.056 seconds/million call.
Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
Evaluation finished: 1.028 seconds, 1.028 seconds/million call.
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1713747826.txt
java.io.FileNotFoundException: join1.q (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(Unknown Source)
at org.apache.hadoop.hive.ql.QTestUtil.addFile(QTestUtil.java:188)
at org.apache.hadoop.hive.ql.QTestUtil.queryListRunner(QTestUtil.java:751)
at org.apache.hadoop.hive.ql.TestMTQueries.testMTQueries1(TestMTQueries.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
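The `FileNotFoundException: join1.q` above comes from `QTestUtil.addFile` opening a bare relative path, which the JVM resolves against its working directory; when launched from Eclipse that defaults to the project root rather than the directory the Ant build runs from. A minimal sketch of the resolution behavior (not Hive code):

```java
import java.io.File;

public class RelativePathDemo {
    public static void main(String[] args) {
        // A relative path such as "join1.q" is resolved against the JVM's
        // working directory (the "user.dir" system property), which in an
        // Eclipse launch configuration defaults to the project root.
        File f = new File("join1.q");
        // The absolute path always ends with the relative name; whether the
        // file is actually found depends entirely on user.dir.
        System.out.println(f.getAbsolutePath().endsWith("join1.q"));
    }
}
```

If this is the cause, pointing the launch configuration's working directory at the directory containing the `.q` files would likely resolve it.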
ExprNodeFuncEvaluator ok
ExprNodeColumnEvaluator ok
testExprNodeConversionEvaluator ok
java.lang.Error: Unresolved compilation problem:
The method getEvalMethod(List<Class<?>>) of type NumericOpMethodResolver must override a superclass method
at org.apache.hadoop.hive.ql.exec.NumericOpMethodResolver.getEvalMethod(NumericOpMethodResolver.java:52)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.getUDFMethod(FunctionRegistry.java:274)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:423)
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getFuncExprNodeDesc(TypeCheckProcFactory.java:379)
at org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator.testExprNodeSpeed(TestExpressionEvaluator.java:168)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
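All of the "Unresolved compilation problem: The method ... must override a superclass method" errors share one likely cause: Eclipse's compiler compliance level is set to 5.0, under which `@Override` is rejected on a method that implements an interface method (it is legal from Java 6 on). A minimal illustration with hypothetical names, mirroring the `getEvalMethod` signature from the traces:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical interface standing in for UDFMethodResolver.
interface Resolver {
    String getEvalMethod(List<Class<?>> argClasses);
}

public class OverrideDemo implements Resolver {
    @Override // an error under 1.5 compliance, accepted at 1.6+
    public String getEvalMethod(List<Class<?>> argClasses) {
        return "resolved:" + argClasses.size();
    }

    public static void main(String[] args) {
        List<Class<?>> argClasses = new ArrayList<>();
        argClasses.add(String.class);
        argClasses.add(Integer.class);
        System.out.println(new OverrideDemo().getEvalMethod(argClasses));
    }
}
```

If so, raising the compliance level under Project > Properties > Java Compiler (or removing the `@Override` annotations) should make these compilation errors, and the test failures they cause, go away.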
input struct = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x01x80x00x00xeax01x80x00x00x02x01x66x69x72x73x74x53x74x72x69x6ex67x00x01x73x65x63x6fx6ex64x53x74x72x69x6ex67x00x01x80x00x00x02x01x66x69x72x73x74x4bx65x79x00x01x80x00x00x01x01x73x65x63x6fx6ex64x4bx65x79x00x01x80x00x00x02x01x7fxffxffx16x01xbfxf0x00x00x00x00x00x00x01x3fxfbxffxffxffxffxffxff
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TBinarySortableProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =xfex7fxffxffx15xfex7fxffxffxfdxfex99x96x8dx8cx8bxacx8bx8dx96x91x98xffxfex8cx9ax9cx90x91x9bxacx8bx8dx96x91x98xffxfex7fxffxffxfdxfex99x96x8dx8cx8bxb4x9ax86xffxfex7fxffxffxfexfex8cx9ax9cx90x91x9bxb4x9ax86xffxfex7fxffxffxfdxfex80x00x00xe9xfex40x0fxffxffxffxffxffxffxfexc0x04x00x00x00x00x00x00
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: com.facebook.thrift.protocol.TBinaryProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x08xffxffx00x00x00xeax0fxffxfex0bx00x00x00x02x00x00x00x0bx66x69x72x73x74x53x74x72x69x6ex67x00x00x00x0cx73x65x63x6fx6ex64x53x74x72x69x6ex67x0dxffxfdx0bx08x00x00x00x02x00x00x00x08x66x69x72x73x74x4bx65x79x00x00x00x01x00x00x00x09x73x65x63x6fx6ex64x4bx65x79x00x00x00x02x08xffxfcxffxffxffx16x04xffxfbx3fxf0x00x00x00x00x00x00x04xffxfaxc0x04x00x00x00x00x00x00x00
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: com.facebook.thrift.protocol.TJSONProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x7bx22x2dx31x22x3ax7bx22x69x33x32x22x3ax32x33x34x7dx2cx22x2dx32x22x3ax7bx22x6cx73x74x22x3ax5bx22x73x74x72x22x2cx32x2cx22x66x69x72x73x74x53x74x72x69x6ex67x22x2cx22x73x65x63x6fx6ex64x53x74x72x69x6ex67x22x5dx7dx2cx22x2dx33x22x3ax7bx22x6dx61x70x22x3ax5bx22x73x74x72x22x2cx22x69x33x32x22x2cx32x2cx7bx22x66x69x72x73x74x4bx65x79x22x3ax31x2cx22x73x65x63x6fx6ex64x4bx65x79x22x3ax32x7dx5dx7dx2cx22x2dx34x22x3ax7bx22x69x33x32x22x3ax2dx32x33x34x7dx2cx22x2dx35x22x3ax7bx22x64x62x6cx22x3ax31x2ex30x7dx2cx22x2dx36x22x3ax7bx22x64x62x6cx22x3ax2dx32x2ex35x7dx7d
bytes in text ={"-1":{"i32":234},"-2":{"lst":["str",2,"firstString","secondString"]},"-3":{"map":["str","i32",2,{"firstKey":1,"secondKey":2}]},"-4":{"i32":-234},"-5":{"dbl":1.0},"-6":{"dbl":-2.5}}
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Testing protocol: org.apache.hadoop.hive.serde2.thrift.TCTLSeparatedProtocol
TypeName = struct{_hello:int,2bye:array<string>,another:map<string,int>,nhello:int,d:double,nd:double}
bytes =x32x33x34x01x66x69x72x73x74x53x74x72x69x6ex67x02x73x65x63x6fx6ex64x53x74x72x69x6ex67x01x66x69x72x73x74x4bx65x79x03x31x02x73x65x63x6fx6ex64x4bx65x79x03x32x01x2dx32x33x34x01x31x2ex30x01x2dx32x2ex35
bytes in text =234firstStringsecondStringfirstKey1secondKey2-2341.0-2.5
o class = class java.util.ArrayList
o size = 6
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}, -234, 1.0, -2.5]
Beginning Test testTBinarySortableProtocol:
Testing struct test { double hello}
Testing struct test { i32 hello}
Testing struct test { i64 hello}
Testing struct test { string hello}
Testing struct test { string hello, double another}
Test testTBinarySortableProtocol passed!
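What testTBinarySortableProtocol checks is that the serialized bytes sort in the same order as the values they encode. Hive's actual TBinarySortableProtocol handles more types and options, but the core idea for an i32, as a rough sketch (illustrative class, not Hive code), is to flip the sign bit so unsigned byte-wise comparison matches signed order:

```java
public class SortableI32 {
    // Flip the sign bit so that unsigned, byte-wise comparison of the
    // 4-byte big-endian encoding matches signed integer order.
    static byte[] encode(int v) {
        int u = v ^ 0x80000000;
        return new byte[] {
            (byte) (u >>> 24), (byte) (u >>> 16), (byte) (u >>> 8), (byte) u
        };
    }

    static int compareBytes(byte[] a, byte[] b) {
        for (int i = 0; i < 4; i++) {
            int d = (a[i] & 0xff) - (b[i] & 0xff);
            if (d != 0) return d;
        }
        return 0;
    }

    public static void main(String[] args) {
        int[] vals = {-234, -1, 0, 1, 234};
        for (int i = 0; i + 1 < vals.length; i++) {
            if (compareBytes(encode(vals[i]), encode(vals[i + 1])) >= 0)
                throw new AssertionError("byte order diverged from numeric order");
        }
        System.out.println("byte order matches numeric order");
    }
}
```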
bytes in text =234 firstStringsecondString firstKey1secondKey2>
compare to =234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o[0] class = class java.lang.Integer
o[1] class = class java.util.ArrayList
o[2] class = class java.util.HashMap
o = [234, [firstString, secondString], {firstKey=1, secondKey=2}]
bytes in text =234 firstStringsecondString firstKey1secondKey2>
compare to =234 firstStringsecondString firstKey1secondKey2>
o class = class java.util.ArrayList
o size = 3
o = [234, null, {firstKey=1, secondKey=2}]
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_992344490.txt
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1962723908.txt
OK
OK
Copying data from file:/home/ssarkar/hive/data/files/kv1.txt
Loading data to table testhivedrivertable
OK
OK
OK
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_247426390.txt
Begin query: altern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/altern1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/altern1.q.out
Done query: altern1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_587924093.txt
Begin query: bad_sample_clause.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/bad_sample_clause.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/bad_sample_clause.q.out
Done query: bad_sample_clause.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1415770190.txt
Begin query: clusterbydistributeby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbydistributeby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbydistributeby.q.out
Done query: clusterbydistributeby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1882308680.txt
Begin query: clusterbysortby.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/clusterbysortby.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/clusterbysortby.q.out
Done query: clusterbysortby.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1579535074.txt
Begin query: clustern1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern1(TestNegativeCliDriver.java:205)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
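This "must override a superclass method" failure, which repeats for every test below, is an Eclipse compiler-settings problem rather than a Hive runtime bug: under Java 5 source compliance, Eclipse rejects @Override on a method that implements an interface method (Java 6 and later allow it), and the class is then left with an unresolved compilation problem that throws java.lang.Error at first use. Setting the project's compiler compliance level to 1.6 (Preferences > Java > Compiler) should clear it. A minimal reproduction, with illustrative names rather than Hive's own:

```java
// Under 1.5 compliance the Eclipse compiler flags the @Override below
// ("must override a superclass method"); under 1.6+ it is legal.
interface NodeProcessor {
    String process(String node);
}

public class OverrideDemo implements NodeProcessor {
    @Override  // implements an interface method, not a superclass method
    public String process(String node) {
        return "processed:" + node;
    }

    public static void main(String[] args) {
        System.out.println(new OverrideDemo().process("clustern1"));
        // prints: processed:clustern1
    }
}
```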
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-430224382.txt
Begin query: clustern2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern2(TestNegativeCliDriver.java:230)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-431481701.txt
Begin query: clustern3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern3(TestNegativeCliDriver.java:255)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1179496399.txt
Begin query: clustern4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clustern4(TestNegativeCliDriver.java:280)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1998238474.txt
Begin query: describe_xpath1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath1.q.out
Done query: describe_xpath1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-93672182.txt
Begin query: describe_xpath2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath2.q.out
Done query: describe_xpath2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1401990633.txt
Begin query: describe_xpath3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath3.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath3.q.out
Done query: describe_xpath3.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_659750364.txt
Begin query: describe_xpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/describe_xpath4.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/describe_xpath4.q.out
Done query: describe_xpath4.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-778063141.txt
Begin query: fileformat_bad_class.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_bad_class.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_bad_class.q.out
Done query: fileformat_bad_class.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1389054449.txt
Begin query: fileformat_void_input.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:430)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_893718016.txt
Begin query: fileformat_void_output.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/fileformat_void_output.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/fileformat_void_output.q.out
Done query: fileformat_void_output.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1795879737.txt
Begin query: input1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/input1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/input1.q.out
Done query: input1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1786217678.txt
Begin query: input2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:505)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1429356131.txt
Begin query: input_testxpath4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:1167)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2724)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:530)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-299734685.txt
Begin query: invalid_create_tbl1.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl1.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl1.q.out
Done query: invalid_create_tbl1.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-3796110.txt
Begin query: invalid_create_tbl2.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_create_tbl2.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_create_tbl2.q.out
Done query: invalid_create_tbl2.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_732040395.txt
Begin query: invalid_select_expression.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_select_expression.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_select_expression.q.out
Done query: invalid_select_expression.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_764555300.txt
Begin query: invalid_tbl_name.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/invalid_tbl_name.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/invalid_tbl_name.q.out
Done query: invalid_tbl_name.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1388068500.txt
Begin query: joinneg.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/joinneg.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/joinneg.q.out
Done query: joinneg.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_1214860.txt
Begin query: load_wrong_fileformat.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/load_wrong_fileformat.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/load_wrong_fileformat.q.out
Done query: load_wrong_fileformat.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1542677940.txt
Begin query: notable_alias3.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFilterPlan(SemanticAnalyzer.java:904)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2712)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:705)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-555682788.txt
Begin query: notable_alias4.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.StrExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$StrExprProcessor.process(TypeCheckProcFactory.java:163)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinReduceSinkChild(SemanticAnalyzer.java:2332)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinOperator(SemanticAnalyzer.java:2380)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genJoinPlan(SemanticAnalyzer.java:2444)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3041)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:730)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-1604113442.txt
Begin query: strict_pruning.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
Exception: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
java.lang.Error: Unresolved compilation problem:
The method process(Node, NodeProcessorCtx, Object...) of type TypeCheckProcFactory.NumExprProcessor must override a superclass method
at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$NumExprProcessor.process(TypeCheckProcFactory.java:121)
at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:80)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:83)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:113)
at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:95)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:3311)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlanReduceSinkOperator(SemanticAnalyzer.java:1688)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genGroupByPlan2MR(SemanticAnalyzer.java:1892)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:2721)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:3048)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:3229)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:43)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:71)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:193)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:174)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:207)
at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:411)
at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:755)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at junit.framework.TestCase.runTest(TestCase.java:154)
at junit.framework.TestCase.runBare(TestCase.java:127)
at junit.framework.TestResult$1.protect(TestResult.java:106)
at junit.framework.TestResult.runProtected(TestResult.java:124)
at junit.framework.TestResult.run(TestResult.java:109)
at junit.framework.TestCase.run(TestCase.java:118)
at junit.framework.TestSuite.runTest(TestSuite.java:208)
at junit.framework.TestSuite.run(TestSuite.java:203)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestReference.run(JUnit3TestReference.java:130)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-327058962.txt
Begin query: subq_insert.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/subq_insert.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/subq_insert.q.out
Done query: subq_insert.q
Hive history file=/home/ssarkar/hive/ql/../build/ql/tmp/hive_job_log_ssarkar_200902031616_-196827093.txt
Begin query: union.q
Loading data to table srcpart partition {ds=2008-04-08, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-08, hr=12}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=11}
OK
Loading data to table srcpart partition {ds=2008-04-09, hr=12}
OK
Loading data to table srcbucket
OK
Loading data to table srcbucket
OK
Loading data to table src
OK
diff -I \(file:\)\|\(/tmp/.*\) /home/ssarkar/hive/build/ql/test/logs/clientnegative/union.q.out /home/ssarkar/hive/ql/src/test/results/clientnegative/union.q.out
Done query: union.q
=====================================================================
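The recurring "must override a superclass method" failures in the output above are the classic symptom of compiling code that uses @Override on interface-method implementations (legal since Java 6) under a Java 5 source/compliance level in Eclipse, which matches the advice in this thread to make sure the project uses Java 6. A minimal, hypothetical illustration (the interface and class names here are simplified stand-ins, not Hive's actual NodeProcessor API):

```java
// Hypothetical reproduction: with Eclipse compiler compliance set to 1.5,
// the @Override below is rejected with "must override a superclass method",
// because Java 5 only allowed @Override on methods overriding a class method.
// With compliance 1.6 or later it compiles cleanly.
interface Processor {
    Object process(Object nd, Object ctx, Object... args);
}

public class OverrideDemo implements Processor {
    @Override  // legal at source level 1.6+, a compile error at 1.5
    public Object process(Object nd, Object ctx, Object... args) {
        return "processed";
    }

    public static void main(String[] args) {
        Processor p = new OverrideDemo();
        System.out.println(p.process(null, null));
    }
}
```

Because Eclipse compiles such classes anyway and injects "Unresolved compilation problem" errors at run time, the tests fail only when the offending method is actually invoked, which is why some negative CLI tests above pass while others throw java.lang.Error.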
Thanks,
Shyam
--- On Tue, 2/3/09, Prasad Chakka <pr...@facebook.com> wrote:
> From: Prasad Chakka <pr...@facebook.com>
> Subject: Re: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 2:51 PM
> I think there are multiple issues. Please do the following
>
>
> 1. 'ant clean' in hive directory
> 2. delete project in eclipse
> 3. Don't change any config values in hive-site.xml
> (revert your changes to fs.default.name etc) and don't
> start the HDFS cluster, since the unit tests run on the
> local file system.
> 4. Check that your java version is 1.6
> 5. Follow the steps in the hive eclipse setup wiki with
> -Dhadoop.version=0.17.2.1
> 6. Open Eclipse and import the project
> 7. Open project preferences and make sure that it is
> using java 6. If it is not, change it to use java 6 (let
> me know if you need help here). If you change it then make
> sure that you rebuild the project by doing a clean
> 8. Make sure that there are no compilation problems for
> the hive project (check 'problems' tab in the bottom
> panel of Eclipse)
> 9. Run the Junit test case. It should run without any
> warning dialogs
>
> Let me know which of these steps fail and what is the
> error. You need not change any files to run a junit testcase.
> Once you are at this point, we can help you in setting up
> command shell that talks to DFS.
>
> Prasad
>
> ________________________________
> From: Ashish Thusoo <at...@facebook.com>
> Reply-To: <hi...@hadoop.apache.org>
> Date: Tue, 3 Feb 2009 14:41:12 -0800
> To: <sh...@yahoo.com>,
> <hi...@hadoop.apache.org>
> Subject: RE: Eclipse run fails !!
>
> Actually for running hive through eclipse you don't
> need to download and start hadoop. Hive tests automatically
> create a local instance of hdfs and map/reduce and are able
> to run it.
>
> The errors that you are getting seem to indicate some jpox
> plugins missing in eclipse. Prasad is an expert in that area
> and can perhaps comment on that...
>
> Ashish
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 2:30 PM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Subject: RE: Eclipse run fails !!
>
> Dear Ashish,
>
> I downloaded hadoop 0.17.0 and tried the bin/start-all.sh
> script, and got one error ::
> ==============================================================
> [ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh starting
> namenode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting datanode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
> ssarkar@localhost's password:
> localhost: starting secondarynamenode, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
> localhost: Exception in thread "main"
> java.lang.NullPointerException
> localhost: at
> org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
> localhost: at
> org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
> localhost: at
> org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
> starting jobtracker, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
> ssarkar@localhost's password:
> localhost: starting tasktracker, logging to
> /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
> [ssarkar@ayush2 hadoop-0.17.0]$
> ===================================================================
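The SecondaryNameNode NullPointerException in the quoted start-all.sh output, thrown from NetUtils.createSocketAddr, typically means an HDFS address property did not resolve to a valid host:port value. Such properties take a URI or host:port string; for reference, a hadoop-site.xml entry of that era would look like the sketch below (the host and port are placeholders, not values from this thread, and per the advice earlier in the thread such overrides should be reverted for running the Hive unit tests):

```xml
<!-- Hedged sketch: hadoop-site.xml override for the default file system.
     hdfs://localhost:9000 is a placeholder, not taken from this thread. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```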
>
> Next I loaded hive project into eclipse following steps in
> hive wiki.
> I tried Run->Run Configurations->JUnit and selected
> TestTruncate to run but got the following error ::
>
> "Errors exist in required Project(s):
>
> hive
>
> Proceed with Launch ?"
>
> When I launched, I got the following errors ::
>
> =================================================================
> 09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening
> raw store with implemenation
> class:org.apache.hadoop.hive.metastore.ObjectStore
> 09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore,
> initialize called
> 09/02/03 14:01:33 INFO metastore.ObjectStore: found
> resource jpox.properties at
> file:/home/ssarkar/hive/conf/jpox.properties
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.views" not registered, but plugin
> "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.perspectiveExtensions" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.preferencePages" not registered,
> but plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.keywords" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.core.launchConfigurationTypes"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.core.launchConfigurationComparators"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchConfigurationTypeImages"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchConfigurationTabGroups"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.newWizards" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.popupMenus" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.actionSets" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.actionSetPartAssociations" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.debug.ui.launchShortcuts" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.core.classpathVariableInitializer"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.ui.quickFixProcessors" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.jdt.ui.classpathFixProcessors" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.ide.markerResolution" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.core.expressions.propertyTesters" not
> registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ltk.core.refactoring.renameParticipants"
> not registered, but plugin "org.eclipse.jdt.junit"
> defined in
> file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF
> refers to it.
> 09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point
> "org.eclipse.ui.commands" not registered, but
> plugin "org.eclipse.jdt.junit" defined in
Re: Eclipse run fails !!
Posted by Prasad Chakka <pr...@facebook.com>.
I think there are multiple issues. Please do the following:
1. Run 'ant clean' in the hive directory.
2. Delete the project in Eclipse.
3. Don't change any config values in hive-site.xml (revert your changes to fs.default.name etc.) and don't start an HDFS cluster, since the unit tests work against the local file system.
4. Check that your Java version is 1.6.
5. Follow the steps in the Hive Eclipse setup wiki with -Dhadoop.version=0.17.2.1.
6. Open Eclipse and import the project.
7. Open the project preferences and make sure it is using Java 6. If it is not, change it to Java 6 (let me know if you need help here). If you change it, make sure to rebuild the project by doing a clean build.
8. Make sure there are no compilation problems in the hive project (check the 'Problems' tab in the bottom panel of Eclipse).
9. Run the JUnit test case. It should run without any warning dialogs.
Let me know which of these steps fail and what the error is. You should not need to change any files to run a JUnit test case. Once you are at this point, we can help you set up a command shell that talks to DFS.
Prasad
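For steps 4 and 7 above, a tiny standalone Java program (illustrative only, not part of Hive; the class name is made up) can confirm which JVM Eclipse actually launches tests with:

```java
public class JavaVersionCheck {
    public static void main(String[] args) {
        // "java.specification.version" is "1.6" on a Java 6 VM, "1.5" on Java 5
        String spec = System.getProperty("java.specification.version");
        System.out.println("Running on Java " + spec
                + " (" + System.getProperty("java.vm.name") + ")");
        if (spec.compareTo("1.6") < 0) {
            System.err.println("WARNING: this VM is older than Java 6");
        }
    }
}
```

Running it from the same launch configuration as the failing tests tells you whether Eclipse's JRE matches the compiler level; a mismatch here produces exactly the UnsupportedClassVersionError seen later in this thread.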
________________________________
From: Ashish Thusoo <at...@facebook.com>
Reply-To: <hi...@hadoop.apache.org>
Date: Tue, 3 Feb 2009 14:41:12 -0800
To: <sh...@yahoo.com>, <hi...@hadoop.apache.org>
Subject: RE: Eclipse run fails !!
Actually, for running Hive through Eclipse you don't need to download and start Hadoop. The Hive tests automatically create a local instance of HDFS and map/reduce and run against it.
The errors you are getting seem to indicate that some JPOX plugins are missing in Eclipse. Prasad is an expert in that area and can perhaps comment on that...
Ashish
-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Tuesday, February 03, 2009 2:30 PM
To: hive-dev@hadoop.apache.org; Ashish Thusoo
Subject: RE: Eclipse run fails !!
Dear Ashish,
I downloaded hadoop 0.17.0 and tried bin/start-all.sh script. I got one error ::
==============================================================
[ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
ssarkar@localhost's password:
localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
ssarkar@localhost's password:
localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
localhost: Exception in thread "main" java.lang.NullPointerException
localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
ssarkar@localhost's password:
localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
[ssarkar@ayush2 hadoop-0.17.0]$
===================================================================
Next I loaded the hive project into Eclipse following the steps in the Hive wiki.
I tried Run->Run Configurations->JUnit and selected TestTruncate to run, but got the following error:
"Errors exist in required project(s):
hive
Proceed with launch ?"
When I launched anyway, I got the following errors:
=================================================================
09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
java.lang.UnsupportedClassVersionError: Bad version number in .class file
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
java.lang.ExceptionInInitializerError
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Caused by: java.lang.RuntimeException: Encountered throwable
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
... 13 more
======================================================================
regards,
Shyam
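The UnsupportedClassVersionError above ("Bad version number in .class file") means a class compiled for a newer JVM is being loaded by an older one. Each class file records its target release in the header as a major version number (49 = Java 5, 50 = Java 6). A minimal sketch of reading that field (the class and method names here are made up for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ClassVersionDemo {
    // Returns the class-file major version (49 = Java 5, 50 = Java 6)
    static int majorVersion(byte[] classBytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(classBytes));
        if (in.readInt() != 0xCAFEBABE) {   // every class file starts with this magic
            throw new IOException("not a class file");
        }
        in.readUnsignedShort();             // minor_version
        return in.readUnsignedShort();      // major_version
    }

    public static void main(String[] args) throws IOException {
        // Fabricated 8-byte header: magic, minor 0, major 50 (a Java 6 class)
        byte[] header = {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE, 0, 0, 0, 50};
        System.out.println("class-file major version: " + majorVersion(header)); // prints 50
    }
}
```

Pointing such a check at the jars on the test classpath shows whether they were built with a newer JDK than the JRE Eclipse is launching.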
--- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "Shyam Sarkar" <sh...@yahoo.com>,
> "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> Date: Tuesday, February 3, 2009, 1:46 PM Hi Shyam,
>
> I can certainly say that 0.17.0 should work with eclipse. I have been
> doing that for a while.
>
> Maybe we can concentrate on fixing why you are not able to create a
> table in hdfs. I am not sure why you could not create the
> /user/hive/warehouse directory in 0.17. Are you saying that
>
> hadoop dfs -mkdir /user/facebook/hive
>
> does not work for you? Can you send the output when you run this
> command?
>
> Ashish
>
> PS: using -Dhadoop.version="0.17.0" for all the commands that are
> given in the wiki should make things work in eclipse.
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 12:00 PM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Subject: RE: Eclipse run fails !!
>
> Dear Ashish,
>
> For the last few days I tried Eclipse 3.4.1 with the 0.17.2.1 version and
> got the same errors with Run->Run. Then I looked into the bin/hive command
> and found that it could not create a table in HDFS. The reason was that
> I could not create the /user/hive/warehouse directory inside HDFS; it was
> using the local Linux FS.
> This is why I switched to 0.19.0, where directories in HDFS can be
> created.
>
> Could you please tell me which exact version of Hadoop will work fine
> with Eclipse runs? I want to get rid of the errors in the project itself
> (before any run).
>
> Regards,
> Shyam
>
> --- On Tue, 2/3/09, Ashish Thusoo
> <at...@facebook.com> wrote:
>
> > From: Ashish Thusoo <at...@facebook.com>
> > Subject: RE: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>,
> > "shyam_sarkar@yahoo.com"
> <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 11:38 AM
> >
> > Hi Shyam,
> >
> > We have not really tried the eclipse stuff for 0.19.0.
> Is it possible
> > for you to use 0.17.0 for now, while we figure this
> out...
> >
> > Ashish
> >
> > -----Original Message-----
> > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > Sent: Tuesday, February 03, 2009 11:26 AM
> > To: hive-dev@hadoop.apache.org
> > Subject: Eclipse run fails !!
> >
> > Hello,
> >
> > I have hive project loaded inside eclipse 3.4.1 and
> hadoop 0.19.0 is
> > running in the background. I could create tables from
> bin/hive
> > command.
> > But when I try to run->run inside eclipse it says::
> >
> > "Errors exist with required project(s):
> >
> > hive
> >
> > Proceed with launch ?"
> >
> > and then it gives many errors.
> >
> > Can someone please tell me why there are errors in
> project hive ? I
> > followed all steps correctly from hive wiki.
> >
> > Regards,
> > shyam_sarkar@yahoo.com
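The checks Ashish suggests above can be exercised straight from the shell. A hypothetical sketch (warehouse path as discussed in the thread, ant targets per the Hive wiki of the time; adjust paths to your installation):

```shell
# Create the Hive warehouse directory in HDFS and confirm it is there
# (path from the thread; run from the hadoop install directory):
bin/hadoop dfs -mkdir /user/hive/warehouse
bin/hadoop dfs -ls /user/hive

# Build and test Hive against an explicit Hadoop version, as the wiki
# commands do -- note the property is spelled hadoop.version:
ant -Dhadoop.version="0.17.0" package
ant -Dhadoop.version="0.17.0" test
```

If the `-mkdir` fails, the error it prints (permissions, connection refused, local FS instead of HDFS) usually narrows down the configuration problem.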
RE: Eclipse run fails !!
Posted by Ashish Thusoo <at...@facebook.com>.
Actually, to run Hive through Eclipse you don't need to download and start Hadoop. The Hive tests automatically create a local HDFS and map/reduce instance and run against it.
The errors you are getting seem to indicate that some JPOX plugins are missing in Eclipse. Prasad is an expert in that area and can perhaps comment on that...
Ashish
-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Tuesday, February 03, 2009 2:30 PM
To: hive-dev@hadoop.apache.org; Ashish Thusoo
Subject: RE: Eclipse run fails !!
Dear Ashish,
I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error ::
==============================================================
[ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
ssarkar@localhost's password:
localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
ssarkar@localhost's password:
localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
localhost: Exception in thread "main" java.lang.NullPointerException
localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
ssarkar@localhost's password:
localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
[ssarkar@ayush2 hadoop-0.17.0]$
===================================================================
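On Hadoop 0.17, a SecondaryNameNode NullPointerException in createSocketAddr like the one above usually traces back to HDFS addresses missing from conf/hadoop-site.xml (for instance fs.default.name left at its local-filesystem default, which would also fit the earlier observation that tables were landing on the Linux FS). A hypothetical hadoop-site.xml sketch using the pre-0.20 property names, with placeholder host and ports:

```xml
<configuration>
  <!-- Point the default filesystem at HDFS instead of the local FS
       (placeholder host/port; adjust to your namenode). -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
  <!-- JobTracker address (placeholder). -->
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```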
Next I loaded the hive project into eclipse following the steps in the hive wiki.
I tried Run->Run Configurations->JUnit and selected TestTruncate to run but got the following error ::
"Errors exist in required Project(s):
hive
Proceed with Launch ?"
When I launched, I got the following errors ::
=================================================================
09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
java.lang.UnsupportedClassVersionError: Bad version number in .class file
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
java.lang.ExceptionInInitializerError
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Caused by: java.lang.RuntimeException: Encountered throwable
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
... 13 more
======================================================================
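The `java.lang.UnsupportedClassVersionError: Bad version number in .class file` in the dump above is the classic symptom of a class compiled for a newer JDK than the JVM that loads it; here, most likely a mismatch between the JDK that built the jars on the classpath and the JRE Eclipse launched the tests with. As an illustration (not part of the thread), the version a class file targets sits in its first eight bytes and can be read with a few lines of Python:

```python
import struct

# Well-known class-file major versions (partial table).
MAJOR_TO_JDK = {49: "5.0", 50: "6", 51: "7", 52: "8"}

def class_file_version(data: bytes) -> tuple:
    """Return (major, minor) from the 8-byte header of a .class file."""
    magic, minor, major = struct.unpack(">IHH", data[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return major, minor

# Simulated header of a class compiled by JDK 6 (major version 50).
header = struct.pack(">IHH", 0xCAFEBABE, 0, 50)
major, minor = class_file_version(header)
print("major %d -> JDK %s" % (major, MAJOR_TO_JDK.get(major, "?")))  # major 50 -> JDK 6
```

A JVM older than the class's target (say, a Java 5 JRE loading a major-version-50 class) refuses to load it with exactly this error; aligning the Eclipse JRE with the JDK the jars were built with resolves it.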
regards,
Shyam
--- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "Shyam Sarkar" <sh...@yahoo.com>,
> "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> Date: Tuesday, February 3, 2009, 1:46 PM
>
> Hi Shyam,
>
> I can certainly say that 0.17.0 should work with eclipse. I have been
> doing that for a while.
>
> Maybe we can concentrate on fixing why you are not able to create a
> table in hdfs. I am not sure why you could not create the
> /user/hive/warehouse directory in 0.17. Are you saying that
>
> hadoop dfs -mkdir /user/facebook/hive
>
> does not work for you? Can you send out the output when you run this
> command?
>
> Ashish
>
> PS: using -Dhadoop.version="0.17.0" for all the commands that are
> given in the wiki should make things work in eclipse.
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 12:00 PM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Subject: RE: Eclipse run fails !!
>
> Dear Ashish,
>
> For the last few days I tried eclipse 3.4.1 with the 0.17.2.1 version and
> got the same errors with run->run. Then I looked into the bin/hive command
> and found that it could not create tables in HDFS. The reason was that
> I could not create the /user/hive/warehouse directory inside HDFS; it was
> using the Linux FS instead.
> This is why I switched to 0.19.0, where directories in HDFS can be
> created.
>
> Could you please tell me which exact version of hadoop will work fine
> with eclipse runs? I want to get rid of the errors in the project itself
> (before any run).
>
> Regards,
> Shyam
>
> --- On Tue, 2/3/09, Ashish Thusoo
> <at...@facebook.com> wrote:
>
> > From: Ashish Thusoo <at...@facebook.com>
> > Subject: RE: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>,
> > "shyam_sarkar@yahoo.com"
> <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 11:38 AM
> >
> > Hi Shyam,
> >
> > We have not really tried the eclipse stuff for 0.19.0.
> Is it possible
> > for you to use 0.17.0 for now, while we figure this
> out...
> >
> > Ashish
> >
> > -----Original Message-----
> > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > Sent: Tuesday, February 03, 2009 11:26 AM
> > To: hive-dev@hadoop.apache.org
> > Subject: Eclipse run fails !!
> >
> > Hello,
> >
> > I have hive project loaded inside eclipse 3.4.1 and
> hadoop 0.19.0 is
> > running in the background. I could create tables from
> bin/hive
> > command.
> > But when I try to run->run inside eclipse it says::
> >
> > "Errors exist with required project(s):
> >
> > hive
> >
> > Proceed with launch ?"
> >
> > and then it gives many errors.
> >
> > Can someone please tell me why there are errors in
> project hive ? I
> > followed all steps correctly from hive wiki.
> >
> > Regards,
> > shyam_sarkar@yahoo.com
RE: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Ashish,
I downloaded hadoop 0.17.0 and tried the bin/start-all.sh script. I got one error ::
==============================================================
[ssarkar@ayush2 hadoop-0.17.0]$ bin/start-all.sh
starting namenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-namenode-ayush2.out
ssarkar@localhost's password:
localhost: starting datanode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-datanode-ayush2.out
ssarkar@localhost's password:
localhost: starting secondarynamenode, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-secondarynamenode-ayush2.out
localhost: Exception in thread "main" java.lang.NullPointerException
localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:119)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.<init>(SecondaryNameNode.java:118)
localhost: at org.apache.hadoop.dfs.SecondaryNameNode.main(SecondaryNameNode.java:495)
starting jobtracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-jobtracker-ayush2.out
ssarkar@localhost's password:
localhost: starting tasktracker, logging to /home/ssarkar/hadoop/hadoop-0.17.0/bin/../logs/hadoop-ssarkar-tasktracker-ayush2.out
[ssarkar@ayush2 hadoop-0.17.0]$
===================================================================
Next I loaded the hive project into eclipse following the steps in the hive wiki.
I tried Run->Run Configurations->JUnit and selected TestTruncate to run
but got the following error ::
"Errors exist in required Project(s):
hive
Proceed with Launch ?"
When I launched, I got the following errors ::
=================================================================
09/02/03 14:01:33 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
09/02/03 14:01:33 INFO metastore.ObjectStore: ObjectStore, initialize called
09/02/03 14:01:33 INFO metastore.ObjectStore: found resource jpox.properties at file:/home/ssarkar/hive/conf/jpox.properties
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.views" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.perspectiveExtensions" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.preferencePages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.keywords" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationTypes" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.core.launchConfigurationComparators" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTypeImages" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchConfigurationTabGroups" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.newWizards" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.popupMenus" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSets" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.actionSetPartAssociations" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.debug.ui.launchShortcuts" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathVariableInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.quickFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathFixProcessors" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.ide.markerResolution" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.expressions.propertyTesters" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ltk.core.refactoring.renameParticipants" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.commands" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.ui.bindings" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.core.runtime.preferences" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.core.classpathContainerInitializer" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 WARN JPOX.Plugin: Extension Point "org.eclipse.jdt.ui.classpathContainerPage" not registered, but plugin "org.eclipse.jdt.junit" defined in file:/home/ssarkar/eclipse/eclipse32_3.4.1_working/configuration/org.eclipse.osgi/bundles/97/1/.cp/META-INF/MANIFEST.MF refers to it.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.ide" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.views" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jface.text" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.workbench.texteditor" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui.editors" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.expressions" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.resources" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.core" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.runtime" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.launching" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.jdt.debug.ui" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.compare" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.core.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.core.variables" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit" requires "org.eclipse.ltk.ui.refactoring" but it cannot be resolved.
09/02/03 14:01:33 ERROR JPOX.Plugin: Bundle "org.eclipse.jdt.junit.runtime" requires "org.junit" but it cannot be resolved.
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.equinox.registry" but it cannot be resolved
09/02/03 14:01:33 WARN JPOX.Plugin: Bundle "org.jpox" has an optional dependency to "org.eclipse.core.runtime" but it cannot be resolved
09/02/03 14:01:33 INFO JPOX.Persistence: ================= Persistence Configuration ===============
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory - Vendor: "JPOX" Version: "1.2.2"
09/02/03 14:01:33 INFO JPOX.Persistence: JPOX Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=../build/test/junit_metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
09/02/03 14:01:33 INFO JPOX.Persistence: ===========================================================
09/02/03 14:01:35 INFO Datastore.Schema: Initialising Catalog "", Schema "APP" using "SchemaTable" auto-start option
09/02/03 14:01:36 INFO Datastore.Schema: Catalog "", Schema "APP" initialised - managing 0 classes
09/02/03 14:01:36 INFO JPOX.JDO: >> Found StoreManager org.jpox.store.rdbms.RDBMSManager
java.lang.UnsupportedClassVersionError: Bad version number in .class file
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:180)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:194)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:124)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:103)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:54)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:82)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:127)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:143)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:115)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:73)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:785)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:798)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:316)
at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:300)
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:105)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
java.lang.ExceptionInInitializerError
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
at junit.framework.TestSuite.createTest(TestSuite.java:131)
at junit.framework.TestSuite.addTestMethod(TestSuite.java:114)
at junit.framework.TestSuite.<init>(TestSuite.java:75)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.getTest(JUnit3TestLoader.java:102)
at org.eclipse.jdt.internal.junit.runner.junit3.JUnit3TestLoader.loadTests(JUnit3TestLoader.java:59)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:445)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:673)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:386)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:196)
Caused by: java.lang.RuntimeException: Encountered throwable
at org.apache.hadoop.hive.ql.exec.TestExecDriver.<clinit>(TestExecDriver.java:113)
... 13 more
======================================================================
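For what it is worth, the java.lang.UnsupportedClassVersionError above ("Bad version number in .class file") usually means the Hive classes were compiled by a newer javac than the JVM that Eclipse launches the tests with (for example, built with JDK 6 but run under JDK 5). The compiler target is recorded as a big-endian "major version" in bytes 6-7 of every .class file, so it can be checked directly. The sketch below fabricates a minimal Java 6 class-file header rather than using one of the failing classes, which are not available here:

```shell
# Major versions: 49 = Java 5, 50 = Java 6, 51 = Java 7.
# Fabricate a minimal Java 6 header: magic CAFEBABE, minor 0, major 50.
printf '\312\376\272\276\000\000\000\062' > /tmp/header.class

# Read the two major-version bytes at offset 6 and combine them.
major=$(od -An -j6 -N2 -t u1 /tmp/header.class | awk '{print $1*256 + $2}')
echo "class file major version: $major"
```

Running the same check against one of the compiled Hive test classes, and comparing `java -version` in the Eclipse launch configuration with the JDK that ant used to build Hive, would show whether this is the mismatch.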
regards,
Shyam
--- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "Shyam Sarkar" <sh...@yahoo.com>, "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>
> Date: Tuesday, February 3, 2009, 1:46 PM
> Hi Shyam,
>
> I can certainly say that 0.17.0 should work with eclipse. I
> have been doing that for a while.
>
> Maybe we can concentrate on fixing why you are not able to
> create a table in hdfs. I am not sure why you could not
> create the /user/hive/warehouse directory in 0.17. Are you
> saying that
>
> hadoop dfs -mkdir /user/facebook/hive
>
> does not work for you? Can you send out the output when you
> run this command.
>
> Ashish
>
> PS: using -Dhadoop.version="0.17.0" for all the
> commands that are given in the wiki should make things work
> in eclipse.
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 12:00 PM
> To: hive-dev@hadoop.apache.org; Ashish Thusoo
> Subject: RE: Eclipse run fails !!
>
> Dear Ashish,
>
> For the last few days I tried eclipse 3.4.1 with 0.17.2.1
> version and got the same errors with run->run. Then I
> looked into bin/hive command and found that it could not
> create table in HDFS. The reason was that I could not create
> /user/hive/warehouse directory inside HDFS. It was using
> Linux FS.
> This is why I switched to 0.19.0 where directories in HDFS
> can be created.
>
> Could you please tell me which exact version of hadoop will
> work fine with eclipse runs ? I want to get rid of errors
> in project itself (before any run).
>
> Regards,
> Shyam
>
> --- On Tue, 2/3/09, Ashish Thusoo
> <at...@facebook.com> wrote:
>
> > From: Ashish Thusoo <at...@facebook.com>
> > Subject: RE: Eclipse run fails !!
> > To: "hive-dev@hadoop.apache.org"
> <hi...@hadoop.apache.org>,
> > "shyam_sarkar@yahoo.com"
> <sh...@yahoo.com>
> > Date: Tuesday, February 3, 2009, 11:38 AM Hi Shyam,
> >
> > We have not really tried the eclipse stuff for 0.19.0.
> Is it possible
> > for you to use 0.17.0 for now, while we figure this
> out...
> >
> > Ashish
> >
> > -----Original Message-----
> > From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> > Sent: Tuesday, February 03, 2009 11:26 AM
> > To: hive-dev@hadoop.apache.org
> > Subject: Eclipse run fails !!
> >
> > Hello,
> >
> > I have hive project loaded inside eclipse 3.4.1 and
> hadoop 0.19.0 is
> > running in the background. I could create tables from
> bin/hive
> > command.
> > But when I try to run->run inside eclipse it says::
> >
> > "Errors exist with required project(s):
> >
> > hive
> >
> > Proceed with launch ?"
> >
> > and then it gives many errors.
> >
> > Can someone please tell me why there are errors in
> project hive ? I
> > followed all steps correctly from hive wiki.
> >
> > Regards,
> > shyam_sarkar@yahoo.com
RE: Eclipse run fails !!
Posted by Ashish Thusoo <at...@facebook.com>.
Hi Shyam,
I can certainly say that 0.17.0 should work with Eclipse. I have been doing that for a while.
Maybe we can concentrate on fixing why you are not able to create a table in HDFS. I am not sure why you could not create the /user/hive/warehouse directory in 0.17. Are you saying that
hadoop dfs -mkdir /user/facebook/hive
does not work for you? Can you send the output when you run this command?
Ashish
PS: using -Dhadoop.version="0.17.0" for all the commands given in the wiki should make things work in Eclipse.
-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Tuesday, February 03, 2009 12:00 PM
To: hive-dev@hadoop.apache.org; Ashish Thusoo
Subject: RE: Eclipse run fails !!
Dear Ashish,
For the last few days I tried Eclipse 3.4.1 with hadoop 0.17.2.1 and got the same errors with Run->Run. Then I looked into the bin/hive command and found that it could not create a table in HDFS. The reason was that I could not create the /user/hive/warehouse directory inside HDFS; it was using the local Linux FS.
This is why I switched to 0.19.0 where directories in HDFS can be created.
Could you please tell me which exact version of hadoop works with Eclipse runs? I want to get rid of the errors in the project itself (before any run).
Regards,
Shyam
--- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>,
> "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 11:38 AM Hi Shyam,
>
> We have not really tried the eclipse stuff for 0.19.0. Is it possible
> for you to use 0.17.0 for now, while we figure this out...
>
> Ashish
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 11:26 AM
> To: hive-dev@hadoop.apache.org
> Subject: Eclipse run fails !!
>
> Hello,
>
> I have hive project loaded inside eclipse 3.4.1 and hadoop 0.19.0 is
> running in the background. I could create tables from bin/hive
> command.
> But when I try to run->run inside eclipse it says::
>
> "Errors exist with required project(s):
>
> hive
>
> Proceed with launch ?"
>
> and then it gives many errors.
>
> Can someone please tell me why there are errors in project hive ? I
> followed all steps correctly from hive wiki.
>
> Regards,
> shyam_sarkar@yahoo.com
RE: Eclipse run fails !!
Posted by Shyam Sarkar <sh...@yahoo.com>.
Dear Ashish,
For the last few days I tried Eclipse 3.4.1 with hadoop 0.17.2.1 and got the same errors with Run->Run. Then I looked into the bin/hive command and found that it could not create a table in HDFS. The reason was that I could not create the /user/hive/warehouse directory inside HDFS; it was using the local Linux FS.
This is why I switched to 0.19.0 where directories in HDFS can be created.
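A common cause of Hive falling back to the local Linux FS is that Hadoop's fs.default.name property is still at its file:/// default, so a path such as /user/hive/warehouse resolves to the local filesystem rather than HDFS. A sketch of the relevant setting follows; the host and port are examples, not values taken from this thread:

```xml
<!-- hadoop-site.xml in 0.17.x (core-site.xml in later releases).
     If fs.default.name is left at file:///, warehouse paths resolve
     to the local filesystem instead of HDFS. -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>  <!-- example host:port -->
</property>
```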
Could you please tell me which exact version of hadoop works with Eclipse runs? I want to get rid of the errors in the project itself (before any run).
Regards,
Shyam
--- On Tue, 2/3/09, Ashish Thusoo <at...@facebook.com> wrote:
> From: Ashish Thusoo <at...@facebook.com>
> Subject: RE: Eclipse run fails !!
> To: "hive-dev@hadoop.apache.org" <hi...@hadoop.apache.org>, "shyam_sarkar@yahoo.com" <sh...@yahoo.com>
> Date: Tuesday, February 3, 2009, 11:38 AM
> Hi Shyam,
>
> We have not really tried the eclipse stuff for 0.19.0. Is
> it possible for you to use 0.17.0 for now, while we figure
> this out...
>
> Ashish
>
> -----Original Message-----
> From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
> Sent: Tuesday, February 03, 2009 11:26 AM
> To: hive-dev@hadoop.apache.org
> Subject: Eclipse run fails !!
>
> Hello,
>
> I have hive project loaded inside eclipse 3.4.1 and hadoop
> 0.19.0 is running in the background. I could create tables
> from bin/hive command.
> But when I try to run->run inside eclipse it says::
>
> "Errors exist with required project(s):
>
> hive
>
> Proceed with launch ?"
>
> and then it gives many errors.
>
> Can someone please tell me why there are errors in project
> hive ? I followed all steps correctly from hive wiki.
>
> Regards,
> shyam_sarkar@yahoo.com
RE: Eclipse run fails !!
Posted by Ashish Thusoo <at...@facebook.com>.
Hi Shyam,
We have not really tried the Eclipse setup with 0.19.0. Is it possible for you to use 0.17.0 for now, while we figure this out...
Ashish
-----Original Message-----
From: Shyam Sarkar [mailto:shyam_sarkar@yahoo.com]
Sent: Tuesday, February 03, 2009 11:26 AM
To: hive-dev@hadoop.apache.org
Subject: Eclipse run fails !!
Hello,
I have the hive project loaded inside Eclipse 3.4.1, and hadoop 0.19.0 is running in the background. I can create tables from the bin/hive command.
But when I try Run->Run inside Eclipse, it says:
"Errors exist with required project(s):
hive
Proceed with launch ?"
and then it gives many errors.
Can someone please tell me why there are errors in the hive project? I followed all the steps from the hive wiki.
Regards,
shyam_sarkar@yahoo.com