Posted to user@hive.apache.org by Aaron Kimball <aa...@cloudera.com> on 2009/05/14 00:59:01 UTC

hive jdbc client usage?

Hi all,

I've been trying to use the Hive JDBC client today with some frustration. My
goal was to execute a simple "SHOW TABLES" statement in Hive.

If I start the Hive server with HIVE_PORT=10000 hive --service hiveserver,
the following happens when I connect to jdbc:hive://localhost:10000/default:
java.sql.SQLException: Method not supported

If instead I attempt to connect to jdbc:hive:// (without the standalone
hiveserver started), I get:
java.sql.SQLException: MetaException(message:hive.metastore.warehouse.dir is
not set in the config or blank)

I'm confused where I should set the hive.metastore.warehouse.dir property.
I've run this from a directory containing a valid hive-site.xml; this
directory is named "conf/", so I also tried running my program in that
directory's parent, thinking it may look for conf/hive-default.xml and
conf/hive-site.xml.  How do I set the configuration files that will be
loaded inside the call to DriverManager.getConnection()? And in the case of
the standalone server, does anyone have any insight into why I'd get "method
not supported" ?

FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't know if
launching a program in this way would mess up Hive's config paths, etc. This
is Hadoop 0.18.3, Hive 0.3.0.

Thanks,
- Aaron
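
For context, a minimal sketch of the kind of client being attempted above (class name, table name and error handling are illustrative; the driver class name is the one that appears in the stack traces later in this thread):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSketch {
  public static void main(String[] args) throws Exception {
    // Register the Hive JDBC driver (shipped in the Hive lib directory).
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    // Standalone mode: assumes hiveserver is listening on localhost:10000.
    // For embedded mode use "jdbc:hive://" and put $HIVE_HOME/conf plus the
    // metastore jars on the classpath, as discussed later in the thread.
    Connection con =
        DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
    Statement stmt = con.createStatement();
    // Metadata-only calls (SHOW TABLES, DESCRIBE) can fail with "Method not
    // supported" on this driver version, so a plain SELECT against an existing
    // table (here assumed to be named foo) is the safer smoke test.
    ResultSet rs = stmt.executeQuery("SELECT * FROM foo");
    while (rs.next()) {
      System.out.println(rs.getString(1));
    }
    con.close();
  }
}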

Re: hive jdbc client usage?

Posted by Raghu Murthy <rm...@facebook.com>.
Thanks for writing the script! Please go ahead and add it to the wiki.


On 6/4/09 10:34 AM, "Bill Graham" <bi...@gmail.com> wrote:

> Thanks guys, this works for me now. The JDOFatalInternalException was
> because the Derby jar was missing from my classpath.  Prasad, there wasn't a
> more detailed stack trace, only what I posted. I'd say that better error
> messaging could help here, but it seems like a hard exception to catch and
> correctly diagnose.
> 
> Either way, here's a script to run the client example in embedded mode that
> might be helpful to others. Should I add it to the wiki?
> 
> #!/bin/bash
> HADOOP_HOME=/path/to/hadoop
> HIVE_HOME=/path/to/dist
> 
> echo -e '1\x01foo' > /tmp/a.txt
> echo -e '2\x01bar' >> /tmp/a.txt
> 
> HADOOP_CORE=`ls $HADOOP_HOME/hadoop-*-core.jar`
> CLASSPATH=.:$HADOOP_CORE:$HIVE_HOME/conf
> 
> for i in ${HIVE_HOME}/lib/*.jar ; do
>     CLASSPATH=$CLASSPATH:$i
> done
> 
> java -cp $CLASSPATH HiveJdbcClient
> 
> 
> 
> On Wed, Jun 3, 2009 at 1:34 PM, Raghu Murthy <rm...@facebook.com> wrote:
>> It should be a path problem. I am not sure which exact jar you are missing,
>> but by adding  all the jars in $HIVE_HOME/lib, $HIVE_HOME/conf, and
>> $HADOOP_HOME/hadoop-*-core.jar (need to set HADOOP_HOME) to the classpath I
>> was able to run the program in embedded mode.
>> 
>> 
>> On 6/3/09 1:05 PM, "Prasad Chakka" <pc...@facebook.com> wrote:
>> 
>>>> Do you have the jdbc driver in the path? If you can paste the full chain of
>>>> exceptions, I may be able to tell you exactly what is missing.
>>>> 
>>>> Thanks,
>>>> Prasad
>>>> 
>>>> 
>>>> 
>>>> From: Bill Graham <bi...@gmail.com>
>>>> Reply-To: <hi...@hadoop.apache.org>, <bi...@gmail.com>
>>>> Date: Wed, 3 Jun 2009 12:57:38 -0700
>>>> To: <hi...@hadoop.apache.org>
>>>> Subject: Re: hive jdbc client usage?
>>>> 
>>>> This worked for me against a remote Hive server fyi, but I needed two more
>>>> jars:
>>>> 
>>>> hive_exec.jar
>>>> log4j-1.2.15
>>>> 
>>>> I then tried running it in embedded mode connecting to "jdbc:hive://" and got
>>>> the same hive.metastore.warehouse.dir exception as Aaron. This seems to be
>>>> because the hive configs aren't being properly loaded. Adding $HIVE_HOME/conf
>>>> to the classpath remedied that exception, but then I needed a few more jars
>>>> from the Hive dist:
>>>> 
>>>> jdo2-api-2.1.jar
>>>> jpox-core-1.2.2.jar
>>>> jpox-rdbms-1.2.2.jar
>>>> 
>>>> This resulted in the following error:
>>>> 
>>>> 09/06/03 12:45:53 INFO JPOX.Persistence:
>>>> ===========================================================
>>>> Exception in thread "main" java.sql.SQLException:
>>>> javax.jdo.JDOFatalInternalException: Error creating transactional
>>> connection
>>>> factory
>>>> NestedThrowables:
>>>> java.lang.reflect.InvocationTargetException
>>>>         at 
>>> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:71)
>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>         at HiveJdbcClient.main(HiveJdbcClient.java:22)
>>>> 
>>>> Any ideas re how to fix this? Should it be possible to run the client in
>>>> embedded mode in this way?
>>>> 
>>>> thanks,
>>>> Bill
>>>> 
>>>> On Wed, Jun 3, 2009 at 11:48 AM, Raghu Murthy <rm...@facebook.com> wrote:
>>>>>> I have created a standalone program for hive jdbc at:
>>>>>> 
>>>>>> http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4
>>>>>> 
>>>>>> Let me know if it works.
>>>>>> 
>>>>>> raghu
>>>>>> 
>>>>>> On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>> 
>>>>>>>>>> I started the server with
>>>>>>>>>> aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000
>>>>>>>> ./hive
>>>>>>>>>> --service hiveserver
>>>>>>>>>> It appeared to start correctly.
>>>>>>>>>> 
>>>>>>>>>> Then ran the test using the ant command-line you gave me. It connected to the
>>>>>>>>>> hiveserver (some output appeared there, including many SqlExceptions regarding
>>>>>>>>>> indices that already exist), but the test fails:
>>>>>>>>>> 
>>>>>>>>>> test:
>>>>>>>>>>     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
>>>>>>>>>>     [junit] Hive history
>>>>>>>>>> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaron_200905141328_716976076.txt
>>>>>>>>>>     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
>>>>>>>>>>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
>>>>>>>>>> 
>>>>>>>>>> BUILD FAILED
>>>>>>>>>> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error occurred
>>>>>>>>>> while executing this line:
>>>>>>>>>> /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
>>>>>>>>>> 
>>>>>>>>>> I've attached the test log.
>>>>>>>>>> - Aaron
>>>>>>>>>> 
>>>>>>>>>> On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com>
>>>>>>>>>> wrote:
>>>>>>>>>>>> Ok, one more test. Can you apply the attached patch and then run
>>>>>>>>>>>> the
>>>>>>>>>>>> following?
>>>>>>>>>>>> 
>>>>>>>>>>>> 1. rebuild
>>>>>>>>>>>> 2. from dist/bin, run hive server on localhost port 10000
>>>>>>>>>>>> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver
>>>>>>>> -Dstandalone=true
>>>>>>>>>>>> 
>>>>>>>>>>>> Does the test succeed?
>>>>>>>>>>>> 
>>>>>>>>>>>> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>>>>>>>> 
>>>>>>>>>>>> I can in fact run the hive cli. I created a table named foo and can
>>>>>>>>>>>> describe it, select from it, etc.
>>>>>>>>>>>> 
>>>>>>>>>>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as
>>>>>>>>>>>> well.
>>>>>>>>>>>> - Aaron
>>>>>>>>>>>> 
>>>>>>>>>>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy
>>>>>>>>>>>> <rm...@facebook.com>
>>>>>>>>>>>> wrote:
>>>>>>>>>>>> Are you able to run the hive cli from the same installation?
>>>>>>>>>>>> There are
>>>>>>>>>>>> currently some issues while running metadata-only calls (show,
>>>>>>>>>>>> describe)
>>>>>>>>>>>> via
>>>>>>>>>>>> JDBC. Regular queries should be fine though.
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> On 5/13/09 3:59 PM, "Aaron Kimball"
>>>>>>>>>>>> <aa...@cloudera.com> wrote:
>>>>>>>>>>>> 
>>>>>>>>>>>> Hi all,
>>>>>>>>>>>> 
>>>>>>>>>>>> I've been trying to use the Hive JDBC client today with some
>>>>>>>>>>>> frustration.
>>>>>>>>>>>> My
>>>>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
>>>>>>>>>>>> 
>>>>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
>>>>>>>>>>>> hiveserver,
>>>>>>>>>>>> the
>>>>>>>>>>>> following happens when I connect to
>>>>>>>>>>>> jdbc:hive://localhost:10000/default:
>>>>>>>>>>>> java.sql.SQLException: Method not supported
>>>>>>>>>>>> 
>>>>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
>>>>>>>>>>>> standalone
>>>>>>>>>>>> hiveserver started), I get:
>>>>>>>>>>>> java.sql.SQLException:
>>>>>>>>>>>> MetaException(message:hive.metastore.warehouse.dir
>>>>>>>>>>>> is
>>>>>>>>>>>> not set in the config or blank)
>>>>>>>>>>>> 
>>>>>>>>>>>> I'm confused where I should set the
>>>>>>>>>>>> hive.metastore.warehouse.dir
>>>>>>>>>>>> property.
>>>>>>>>>>>> I've run this from a directory containing a valid
>>>>>>>>>>>> hive-site.xml; this
>>>>>>>>>>>> directory is named "conf/", so I also tried running my program in
>>>>>>>>>>>> that
>>>>>>>>>>>> directory's parent, thinking it may look for
>>>>>>>>>>>> conf/hive-default.xml
>>>>>>>>>>>> and
>>>>>>>>>>>> conf/hive-site.xml.  How do I set the configuration
>>>>>>>>>>>> files that will
>>>>>>>>>>>> be
>>>>>>>>>>>> loaded
>>>>>>>>>>>> inside the call to DriverManager.getConnection()? And in
>>>>>>>>>>>> the case of
>>>>>>>>>>>> the
>>>>>>>>>>>> standalone server, does anyone have any insight into why I'd get
>>>>>>>>>>>> "method not supported"?
>>>>>>>>>>>> 
>>>>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't
>>>>>>>>>>>> know if
>>>>>>>>>>>> launching a program in this way would mess up Hive's
>>>>>>>>>>>> config paths,
>>>>>>>>>>>> etc.
>>>>>>>>>>>> This
>>>>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
>>>>>>>>>>>> 
>>>>>>>>>>>> Thanks,
>>>>>>>>>>>> - Aaron
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>> 
>>>>>> 
>>>> 
>>>> 
>> 
> 


Re: hive jdbc client usage?

Posted by Bill Graham <bi...@gmail.com>.
Thanks guys, this works for me now. The JDOFatalInternalException was
because the Derby jar was missing from my classpath.  Prasad, there wasn't a
more detailed stack trace, only what I posted. I'd say that better error
messaging could help here, but it seems like a hard exception to catch and
correctly diagnose.

Either way, here's a script to run the client example in embedded mode that
might be helpful to others. Should I add it to the wiki?

#!/bin/bash
HADOOP_HOME=/path/to/hadoop
HIVE_HOME=/path/to/dist

echo -e '1\x01foo' > /tmp/a.txt
echo -e '2\x01bar' >> /tmp/a.txt

HADOOP_CORE=`ls $HADOOP_HOME/hadoop-*-core.jar`
CLASSPATH=.:$HADOOP_CORE:$HIVE_HOME/conf

for i in ${HIVE_HOME}/lib/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done

java -cp $CLASSPATH HiveJdbcClient
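
For completeness, the HiveJdbcClient class the script invokes is the standalone example from the wiki page linked in the quoted message below; a rough sketch of that kind of embedded-mode client (table name and statements are illustrative, not the wiki code verbatim) is:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcClient {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    // Embedded mode: no hiveserver needed, but $HIVE_HOME/conf plus the metastore
    // jars (jdo2, jpox, derby) must be on the classpath, as discussed in this thread.
    Connection con = DriverManager.getConnection("jdbc:hive://", "", "");
    Statement stmt = con.createStatement();
    // executeQuery is used for every statement here; other Statement methods may be
    // unimplemented in this driver version. Re-running will fail on the CREATE
    // unless the table is dropped first.
    stmt.executeQuery("CREATE TABLE a_txt (key INT, value STRING)");
    // /tmp/a.txt was written above with ctrl-A (\x01) field delimiters, which is
    // Hive's default, so no ROW FORMAT clause is needed.
    stmt.executeQuery("LOAD DATA LOCAL INPATH '/tmp/a.txt' OVERWRITE INTO TABLE a_txt");
    ResultSet rs = stmt.executeQuery("SELECT * FROM a_txt");
    while (rs.next()) {
      System.out.println(rs.getString(1) + "\t" + rs.getString(2));
    }
    con.close();
  }
}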



On Wed, Jun 3, 2009 at 1:34 PM, Raghu Murthy <rm...@facebook.com> wrote:

> It should be a path problem. I am not sure which exact jar you are missing,
> but by adding  all the jars in $HIVE_HOME/lib, $HIVE_HOME/conf, and
> $HADOOP_HOME/hadoop-*-core.jar (need to set HADOOP_HOME) to the classpath I
> was able to run the program in embedded mode.
>
>
> On 6/3/09 1:05 PM, "Prasad Chakka" <pc...@facebook.com> wrote:
>
> > Do you have the jdbc driver in the path? If you can paste the full chain of
> > exceptions, I may be able to tell you exactly what is missing.
> >
> > Thanks,
> > Prasad
> >
> >
> >
> > From: Bill Graham <bi...@gmail.com>
> > Reply-To: <hi...@hadoop.apache.org>, <bi...@gmail.com>
> > Date: Wed, 3 Jun 2009 12:57:38 -0700
> > To: <hi...@hadoop.apache.org>
> > Subject: Re: hive jdbc client usage?
> >
> > This worked for me against a remote Hive server fyi, but I needed two
> more
> > jars:
> >
> > hive_exec.jar
> > log4j-1.2.15
> >
> > I then tried running it in embedded mode connecting to "jdbc:hive://" and got
> > the same hive.metastore.warehouse.dir exception as Aaron. This seems to be
> > because the hive configs aren't being properly loaded. Adding $HIVE_HOME/conf
> > to the classpath remedied that exception, but then I needed a few more jars
> > from the Hive dist:
> >
> > jdo2-api-2.1.jar
> > jpox-core-1.2.2.jar
> > jpox-rdbms-1.2.2.jar
> >
> > This resulted in the following error:
> >
> > 09/06/03 12:45:53 INFO JPOX.Persistence:
> > ===========================================================
> > Exception in thread "main" java.sql.SQLException:
> > javax.jdo.JDOFatalInternalException: Error creating transactional
> connection
> > factory
> > NestedThrowables:
> > java.lang.reflect.InvocationTargetException
> >         at
> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:71)
> >         at java.sql.DriverManager.getConnection(DriverManager.java:582)
> >         at java.sql.DriverManager.getConnection(DriverManager.java:185)
> >         at HiveJdbcClient.main(HiveJdbcClient.java:22)
> >
> > Any ideas re how to fix this? Should it be possible to run the client in
> > embedded mode in this way?
> >
> > thanks,
> > Bill
> >
> > On Wed, Jun 3, 2009 at 11:48 AM, Raghu Murthy <rm...@facebook.com>
> wrote:
> >> I have created a standalone program for hive jdbc at:
> >> http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4
> >>
> >> Let me know if it works.
> >>
> >> raghu
> >>
> >> On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
> >>
> >>>> I started the server with
> >>>> aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000
> >>> ./hive
> >>>> --service hiveserver
> >>>> It appeared to start correctly.
> >>>>
> >>>> Then ran the test using the ant command-line you gave me. It connected to the
> >>>> hiveserver (some output appeared there, including many SqlExceptions regarding
> >>>> indices that already exist), but the test fails:
> >>>>
> >>>> test:
> >>>>     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
> >>>>     [junit] Hive history
> >>>> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaron_200905141328_716976076.txt
> >>>>     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
> >>>>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
> >>>>
> >>>> BUILD FAILED
> >>>> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error
> >>> occurred
> >>>> while executing this line:
> >>>> /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
> >>>>
> >>>> I've attached the test log.
> >>>> - Aaron
> >>>>
> >>>> On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com>
> wrote:
> >>>>>> Ok, one more test. Can you apply the attached patch and then run the
> >>>>>> following?
> >>>>>>
> >>>>>> 1. rebuild
> >>>>>> 2. from dist/bin, run hive server on localhost port 10000
> >>>>>> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver
> -Dstandalone=true
> >>>>>>
> >>>>>> Does the test succeed?
> >>>>>>
> >>>>>> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
> >>>>>>
> >>>>>>>>>> I can in fact run the hive cli. I created a table named foo and can
> >>>>>>>>>> describe it, select from it, etc.
> >>>>>>>>>>
> >>>>>>>>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as
> >>>>>>>>>> well.
> >>>>>>>>>> - Aaron
> >>>>>>>>>>
> >>>>>>>>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rmurthy@facebook.com>
> >>>>>>>>>> wrote:
> >>>>>>>>>>>> Are you able to run the hive cli from the same installation?
> >>>>>>>> There are
> >>>>>>>>>>>> currently some issues while running metadata-only calls (show,
> >>>>>>>> describe)
> >>>>>>>>>>>> via
> >>>>>>>>>>>> JDBC. Regular queries should be fine though.
> >>>>>>>>>>>>
> >>>>>>>>>>>>
> >>>>>>>>>>>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com>
> wrote:
> >>>>>>>>>>>>
> >>>>>>>>>>>> Hi all,
> >>>>>>>>>>>>
> >>>>>>>>>>>> I've been trying to use the Hive JDBC client today with some
> >>>>>>>>>>>> frustration.
> >>>>>>>>>>>> My
> >>>>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
> >>>>>>>>>>>>
> >>>>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
> >>>>>>>>>>>> hiveserver,
> >>>>>>>>>>>> the
> >>>>>>>>>>>> following happens when I connect to
> >>>>>>>>>>>> jdbc:hive://localhost:10000/default:
> >>>>>>>>>>>> java.sql.SQLException: Method not supported
> >>>>>>>>>>>>
> >>>>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
> >>>>>>>>>>>> standalone
> >>>>>>>>>>>> hiveserver started), I get:
> >>>>>>>>>>>> java.sql.SQLException:
> >>>>>>>>>>>> MetaException(message:hive.metastore.warehouse.dir
> >>>>>>>>>>>> is
> >>>>>>>>>>>> not set in the config or blank)
> >>>>>>>>>>>>
> >>>>>>>>>>>> I'm confused where I should set the
> >>>>>>>>>>>> hive.metastore.warehouse.dir
> >>>>>>>>>>>> property.
> >>>>>>>>>>>> I've run this from a directory containing a valid
> >>>>>>>>>>>> hive-site.xml; this
> >>>>>>>>>>>> directory is named "conf/", so I also tried running my program
> in
> >>>>>>>>>>>> that
> >>>>>>>>>>>> directory's parent, thinking it may look for
> >>>>>>>>>>>> conf/hive-default.xml
> >>>>>>>>>>>> and
> >>>>>>>>>>>> conf/hive-site.xml.  How do I set the configuration files that
> will
> >>>>>>>>>>>> be
> >>>>>>>>>>>> loaded
> >>>>>>>>>>>> inside the call to DriverManager.getConnection()? And in the case of
> >>>>>>>>>>>> the
> >>>>>>>>>>>> standalone server, does anyone have any insight into why I'd get
> >>>>>>>>>>>> "method not supported"?
> >>>>>>>>>>>>
> >>>>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I
> don't
> >>>>>>>>>>>> know if
> >>>>>>>>>>>> launching a program in this way would mess up Hive's
> >>>>>>>>>>>> config paths,
> >>>>>>>>>>>> etc.
> >>>>>>>>>>>> This
> >>>>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
> >>>>>>>>>>>>
> >>>>>>>>>>>> Thanks,
> >>>>>>>>>>>> - Aaron
> >>>>>>>>>>>>
> >>>>>>>>>>
> >>>>>>
> >>>>
> >>
> >
> >
>
>

Re: hive jdbc client usage?

Posted by Raghu Murthy <rm...@facebook.com>.
It should be a path problem. I am not sure which exact jar you are missing,
but by adding  all the jars in $HIVE_HOME/lib, $HIVE_HOME/conf, and
$HADOOP_HOME/hadoop-*-core.jar (need to set HADOOP_HOME) to the classpath I
was able to run the program in embedded mode.


On 6/3/09 1:05 PM, "Prasad Chakka" <pc...@facebook.com> wrote:

> Do you have the jdbc driver in the path? If you can paste the full chain of
> exceptions, I may be able to tell you exactly what is missing.
> 
> Thanks,
> Prasad
> 
> 
> 
> From: Bill Graham <bi...@gmail.com>
> Reply-To: <hi...@hadoop.apache.org>, <bi...@gmail.com>
> Date: Wed, 3 Jun 2009 12:57:38 -0700
> To: <hi...@hadoop.apache.org>
> Subject: Re: hive jdbc client usage?
> 
> This worked for me against a remote Hive server fyi, but I needed two more
> jars:
> 
> hive_exec.jar
> log4j-1.2.15
> 
> I then tried running it in embedded mode connecting to "jdbc:hive://" and got
> the same hive.metastore.warehouse.dir exception as Aaron. This seems to be
> because the hive configs aren't being properly loaded. Adding $HIVE_HOME/conf
> to the classpath remedied that exception, but then I needed a few more jars
> from the Hive dist:
> 
> jdo2-api-2.1.jar
> jpox-core-1.2.2.jar
> jpox-rdbms-1.2.2.jar
> 
> This resulted in the following error:
> 
> 09/06/03 12:45:53 INFO JPOX.Persistence:
> ===========================================================
> Exception in thread "main" java.sql.SQLException:
> javax.jdo.JDOFatalInternalException: Error creating transactional connection
> factory
> NestedThrowables:
> java.lang.reflect.InvocationTargetException
>         at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:71)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at HiveJdbcClient.main(HiveJdbcClient.java:22)
> 
> Any ideas re how to fix this? Should it be possible to run the client in
> embedded mode in this way?
> 
> thanks,
> Bill
> 
> On Wed, Jun 3, 2009 at 11:48 AM, Raghu Murthy <rm...@facebook.com> wrote:
>> I have created a standalone program for hive jdbc at:
>> http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4
>> 
>> Let me know if it works.
>> 
>> raghu
>> 
>> On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>> 
>>>> I started the server with
>>>> aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000
>>> ./hive
>>>> --service hiveserver
>>>> It appeared to start correctly.
>>>> 
>>>> Then ran the test using the ant command-line you gave me. It connected to the
>>>> hiveserver (some output appeared there, including many SqlExceptions
>>> regarding
>>>> indices that already exist), but the test fails:
>>>> 
>>>> test:
>>>>     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
>>>>     [junit] Hive history
>>>> 
>>>> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaron_200905141328_716976076.txt
>>>>     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
>>>>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
>>>> 
>>>> BUILD FAILED
>>>> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error
>>> occurred
>>>> while executing this line:
>>>> /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
>>>> 
>>>> I've attached the test log.
>>>> - Aaron
>>>> 
>>>> On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com> wrote:
>>>>>> Ok, one more test. Can you apply the attached patch and then run the
>>>>>> following?
>>>>>> 
>>>>>> 1. rebuild
>>>>>> 2. from dist/bin, run hive server on localhost port 10000
>>>>>> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
>>>>>> 
>>>>>> Does the test succeed?
>>>>>> 
>>>>>> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>> 
>>>>>>>>>> I can in fact run the hive cli. I created a table named foo and can
>>>>>>>> describe
>>>>>>>>>> it, select from it, etc.
>>>>>>>>>> 
>>>>>>>>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as
>>>>>>>>>> well.
>>>>>>>>>> - Aaron
>>>>>>>>>> 
>>>>>>>>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com>
>>>>>>>>>> wrote:
>>>>>>>>>>>> Are you able to run the hive cli from the same installation?
>>>>>>>> There are
>>>>>>>>>>>> currently some issues while running metadata-only calls (show,
>>>>>>>> describe)
>>>>>>>>>>>> via
>>>>>>>>>>>> JDBC. Regular queries should be fine though.
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>>>>>>>> 
>>>>>>>>>>>> Hi all,
>>>>>>>>>>>> 
>>>>>>>>>>>> I've been trying to use the Hive JDBC client today with some
>>>>>>>>>>>> frustration.
>>>>>>>>>>>> My
>>>>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
>>>>>>>>>>>> 
>>>>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
>>>>>>>>>>>> hiveserver,
>>>>>>>>>>>> the
>>>>>>>>>>>> following happens when I connect to
>>>>>>>>>>>> jdbc:hive://localhost:10000/default:
>>>>>>>>>>>> java.sql.SQLException: Method not supported
>>>>>>>>>>>> 
>>>>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
>>>>>>>>>>>> standalone
>>>>>>>>>>>> hiveserver started), I get:
>>>>>>>>>>>> java.sql.SQLException:
>>>>>>>>>>>> MetaException(message:hive.metastore.warehouse.dir
>>>>>>>>>>>> is
>>>>>>>>>>>> not set in the config or blank)
>>>>>>>>>>>> 
>>>>>>>>>>>> I'm confused where I should set the
>>>>>>>>>>>> hive.metastore.warehouse.dir
>>>>>>>>>>>> property.
>>>>>>>>>>>> I've run this from a directory containing a valid
>>>>>>>>>>>> hive-site.xml; this
>>>>>>>>>>>> directory is named "conf/", so I also tried running my program in
>>>>>>>>>>>> that
>>>>>>>>>>>> directory's parent, thinking it may look for
>>>>>>>>>>>> conf/hive-default.xml
>>>>>>>>>>>> and
>>>>>>>>>>>> conf/hive-site.xml.  How do I set the configuration files that will
>>>>>>>>>>>> be
>>>>>>>>>>>> loaded
>>>>>>>>>>>> inside the call to DriverManager.getConnection()? And in the case of
>>>>>>>>>>>> the
>>>>>>>>>>>> standalone server, does anyone have any insight into why I'd get
>>>>>>>>>>>> "method not supported"?
>>>>>>>>>>>> 
>>>>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't
>>>>>>>>>>>> know if
>>>>>>>>>>>> launching a program in this way would mess up Hive's
>>>>>>>>>>>> config paths,
>>>>>>>>>>>> etc.
>>>>>>>>>>>> This
>>>>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
>>>>>>>>>>>> 
>>>>>>>>>>>> Thanks,
>>>>>>>>>>>> - Aaron
>>>>>>>>>>>> 
>>>>>>>>>> 
>>>>>> 
>>>> 
>> 
> 
> 


Re: hive jdbc client usage?

Posted by Prasad Chakka <pc...@facebook.com>.
Do you have the jdbc driver in the path? If you can paste the full chain of exceptions, I may be able to tell you exactly what is missing.

Thanks,
Prasad


________________________________
From: Bill Graham <bi...@gmail.com>
Reply-To: <hi...@hadoop.apache.org>, <bi...@gmail.com>
Date: Wed, 3 Jun 2009 12:57:38 -0700
To: <hi...@hadoop.apache.org>
Subject: Re: hive jdbc client usage?

This worked for me against a remote Hive server fyi, but I needed two more jars:

hive_exec.jar
log4j-1.2.15

I then tried running it in embedded mode connecting to "jdbc:hive://" and got the same hive.metastore.warehouse.dir exception as Aaron. This seems to be because the hive configs aren't being properly loaded. Adding $HIVE_HOME/conf to the classpath remedied that exception, but then I needed a few more jars from the Hive dist:

jdo2-api-2.1.jar
jpox-core-1.2.2.jar
jpox-rdbms-1.2.2.jar

This resulted in the following error:

09/06/03 12:45:53 INFO JPOX.Persistence: ===========================================================
Exception in thread "main" java.sql.SQLException: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:71)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at HiveJdbcClient.main(HiveJdbcClient.java:22)

Any ideas re how to fix this? Should it be possible to run the client in embedded mode in this way?

thanks,
Bill

On Wed, Jun 3, 2009 at 11:48 AM, Raghu Murthy <rm...@facebook.com> wrote:
I have created a standalone program for hive jdbc at:
http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4

Let me know if it works.

raghu

On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:

> I started the server with
> aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000 ./hive
> --service hiveserver
> It appeared to start correctly.
>
> Then ran the test using the ant command-line you gave me. It connected to the
> hiveserver (some output appeared there, including many SqlExceptions regarding
> indices that already exist), but the test fails:
>
> test:
>     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
>     [junit] Hive history
> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaro
> n_200905141328_716976076.txt
>     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
>
> BUILD FAILED
> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error occurred
> while executing this line:
> /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
>
> I've attached the test log.
> - Aaron
>
> On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com> wrote:
>> Ok, one more test. Can you apply the attached patch and then run the
>> following?
>>
>> 1. rebuild
>> 2. from dist/bin, run hive server on localhost port 10000
>> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
>>
>> Does the test succeed?
>>
>> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>
>>>> I can in fact run the hive cli. I created a table named foo and can
>>> describe
>>>> it, select from it, etc.
>>>>
>>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as well.
>>>> - Aaron
>>>>
>>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com> wrote:
>>>>>> Are you able to run the hive cli from the same installation? There are
>>>>>> currently some issues while running metadata-only calls (show, describe)
>>>>>> via
>>>>>> JDBC. Regular queries should be fine though.
>>>>>>
>>>>>>
>>>>>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>>
>>>>>>>>>> Hi all,
>>>>>>>>>>
>>>>>>>>>> I've been trying to use the Hive JDBC client today with some
>>>>>> frustration.
>>>>>>>>>>>> My
>>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
>>>>>>>>>>
>>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
>>>>>> hiveserver,
>>>>>>>>>>>> the
>>>>>>>>>> following happens when I connect to
>>>>>> jdbc:hive://localhost:10000/default:
>>>>>>>>>> java.sql.SQLException: Method not supported
>>>>>>>>>>
>>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
>>>>>> standalone
>>>>>>>>>> hiveserver started), I get:
>>>>>>>>>> java.sql.SQLException:
>>>>>> MetaException(message:hive.metastore.warehouse.dir
>>>>>>>>>>>> is
>>>>>>>>>> not set in the config or blank)
>>>>>>>>>>
>>>>>>>>>> I'm confused where I should set the hive.metastore.warehouse.dir
>>>>>> property.
>>>>>>>>>> I've run this from a directory containing a valid hive-site.xml; this
>>>>>>>>>> directory is named "conf/", so I also tried running my program in
>>>>>>>>>> that
>>>>>>>>>> directory's parent, thinking it may look for conf/hive-default.xml
>>>>>>>>>> and
>>>>>>>>>> conf/hive-site.xml.  How do I set the configuration files that will
>>>>>>>>>> be
>>>>>>>> loaded
>>>>>>>>>> inside the call to DriverManager.getConnection()? And in the case of
>>>>>>>>>> the
>>>>>>>>>> standalone server, does anyone have any insight into why I'd get
>>>>>>>>>> "method not supported"?
>>>>>>>>>>
>>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't
>>>>>> know if
>>>>>>>>>> launching a program in this way would mess up Hive's config paths,
>>>>>>>>>> etc.
>>>>>>>> This
>>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> - Aaron
>>>>>>
>>>>
>>
>




Re: hive jdbc client usage?

Posted by Bill Graham <bi...@gmail.com>.
This worked for me against a remote Hive server fyi, but I needed two more
jars:

hive_exec.jar
log4j-1.2.15

I then tried running it in embedded mode connecting to "jdbc:hive://" and
got the same hive.metastore.warehouse.dir exception as Aaron. This seems to
be because the hive configs aren't being properly loaded. Adding
$HIVE_HOME/conf to the classpath remedied that exception, but then I needed
a few more jars from the Hive dist:

jdo2-api-2.1.jar
jpox-core-1.2.2.jar
jpox-rdbms-1.2.2.jar

This resulted in the following error:

09/06/03 12:45:53 INFO JPOX.Persistence:
===========================================================
Exception in thread "main" java.sql.SQLException:
javax.jdo.JDOFatalInternalException: Error creating transactional connection
factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
        at
org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:71)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at HiveJdbcClient.main(HiveJdbcClient.java:22)

Any ideas re how to fix this? Should it be possible to run the client in
embedded mode in this way?

thanks,
Bill

On Wed, Jun 3, 2009 at 11:48 AM, Raghu Murthy <rm...@facebook.com> wrote:

> I have created a standalone program for hive jdbc at:
>
> http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4
>
> Let me know if it works.
>
> raghu
>
> On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>
> > I started the server with
> > aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000
> ./hive
> > --service hiveserver
> > It appeared to start correctly.
> >
> > Then ran the test using the ant command-line you gave me. It connected to
> the
> > hiveserver (some output appeared there, including many SqlExceptions
> regarding
> > indices that already exist), but the test fails:
> >
> > test:
> >     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
> >     [junit] Hive history
> >
> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaro
> > n_200905141328_716976076.txt
> >     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
> >     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
> >
> > BUILD FAILED
> > /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error
> occurred
> > while executing this line:
> > /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
> >
> > I've attached the test log.
> > - Aaron
> >
> > On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com>
> wrote:
> >> Ok, one more test. Can you apply the attached patch and then run the
> >> following?
> >>
> >> 1. rebuild
> >> 2. from dist/bin, run hive server on localhost port 10000
> >> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
> >>
> >> Does the test succeed?
> >>
> >> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
> >>
> >>>> I can in fact run the hive cli. I created a table named foo and can
> >>> describe
> >>>> it, select from it, etc.
> >>>>
> >>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as
> well.
> >>>> - Aaron
> >>>>
> >>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com>
> wrote:
> >>>>>> Are you able to run the hive cli from the same installation? There
> are
> >>>>>> currently some issues while running metadata-only calls (show,
> describe)
> >>>>>> via
> >>>>>> JDBC. Regular queries should be fine though.
> >>>>>>
> >>>>>>
> >>>>>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
> >>>>>>
> >>>>>>>>>> Hi all,
> >>>>>>>>>>
> >>>>>>>>>> I've been trying to use the Hive JDBC client today with some
> >>>>>> frustration.
> >>>>>>>>>>>> My
> >>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
> >>>>>>>>>>
> >>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
> >>>>>> hiveserver,
> >>>>>>>>>>>> the
> >>>>>>>>>> following happens when I connect to
> >>>>>> jdbc:hive://localhost:10000/default:
> >>>>>>>>>> java.sql.SQLException: Method not supported
> >>>>>>>>>>
> >>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
> >>>>>> standalone
> >>>>>>>>>> hiveserver started), I get:
> >>>>>>>>>> java.sql.SQLException:
> >>>>>> MetaException(message:hive.metastore.warehouse.dir
> >>>>>>>>>>>> is
> >>>>>>>>>> not set in the config or blank)
> >>>>>>>>>>
> >>>>>>>>>> I'm confused where I should set the hive.metastore.warehouse.dir
> >>>>>> property.
> >>>>>>>>>> I've run this from a directory containing a valid hive-site.xml;
> this
> >>>>>>>>>> directory is named "conf/", so I also tried running my program
> in
> >>>>>>>>>> that
> >>>>>>>>>> directory's parent, thinking it may look for
> conf/hive-default.xml
> >>>>>>>>>> and
> >>>>>>>>>> conf/hive-site.xml.  How do I set the configuration files that
> will
> >>>>>>>>>> be
> >>>>>>>> loaded
> >>>>>>>>>> inside the call to DriverManager.getConnection()? And in the
> case of
> >>>>>>>>>> the
> >>>>>>>>>> standalone server, does anyone have any insight into why I'd get
> >>>>>>>>>> "method not supported"?
> >>>>>>>>>>
> >>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I
> don't
> >>>>>> know if
> >>>>>>>>>> launching a program in this way would mess up Hive's config
> paths,
> >>>>>>>>>> etc.
> >>>>>>>> This
> >>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
> >>>>>>>>>>
> >>>>>>>>>> Thanks,
> >>>>>>>>>> - Aaron
> >>>>>>
> >>>>
> >>
> >
>
>

Re: Can Hive recognize commented out line in data files while loading?

Posted by Raghu Murthy <rm...@facebook.com>.
I guess the question was about loading the data. In the load command, we
currently just copy over the data without parsing it (via the CopyTask).

Even though we could choose to neglect commented rows at query time (via
> SerDes), it's probably more efficient to do it once while loading.

In addition, it would be good to provide more features to the load command
like verifying schema and loading to multiple partitions based on columns in
the rows. Can you file a jira for this? I can take a shot at implementing
these features.

On 5/20/09 11:26 PM, "Zheng Shao" <zs...@gmail.com> wrote:

> The Hive internal Serdes do not allow this format yet. We will need to change
> Hive code to make that happen.
> Specifically, it's the LazySimpleSerDe class.
> 
> Zheng
> 
> On Wed, May 20, 2009 at 11:05 PM, Manhee Jo <jo...@nttdocomo.com> wrote:
>> Is it possible for hive to recognize commented rows in a file when it loads a
>> csv file?
>> 
>> For example, say contents of test.csv is,
>> 
>> #123
>> #Red, Brown, Black, Blue
>> 3, AB, 5, 3
>> 2, AA, 1, 4
>> ...
>> 
>> In hive, how to ignore first two lines while loading?
>> 
>> 
>> Thanks,
>> Manhee 
>> 
> 
> 


Re: Can Hive recognize commented out line in data files while loading?

Posted by Zheng Shao <zs...@gmail.com>.
The Hive internal Serdes do not allow this format yet. We will need to
change Hive code to make that happen.
Specifically, it's the LazySimpleSerDe class.

Zheng

On Wed, May 20, 2009 at 11:05 PM, Manhee Jo <jo...@nttdocomo.com> wrote:

> Is it possible for hive to recognize commented rows in a file when it loads
> a csv file?
>
> For example, say contents of test.csv is,
>
> #123
> #Red, Brown, Black, Blue
> 3, AB, 5, 3
> 2, AA, 1, 4
> ...
>
> In hive, how to ignore first two lines while loading?
>
>
> Thanks,
> Manhee
>
>


-- 
Yours,
Zheng

Re: Can Hive recognize commented out line in data files while loading?

Posted by Prasad Chakka <pc...@facebook.com>.
Hive doesn't do any transformation while loading, at least not yet. You can load the data into a temporary table and then do a 'insert overwrite <tab> select * from <tmp_tab> where <predicate that filters out comments>'
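
A sketch of that sequence for the test.csv example quoted below, issued through the Hive JDBC client from the other thread; the table and column names, and the assumption that comment lines start with '#' in the first field, are illustrative:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class FilterCsvComments {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
    Connection con = DriverManager.getConnection("jdbc:hive://", "", "");
    Statement stmt = con.createStatement();
    // Staging table matching the raw test.csv layout; all STRING so comment rows
    // survive long enough to be filtered out.
    stmt.executeQuery("CREATE TABLE test_csv_raw (c1 STRING, c2 STRING, c3 STRING, c4 STRING) "
        + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','");
    stmt.executeQuery("LOAD DATA LOCAL INPATH '/tmp/test.csv' OVERWRITE INTO TABLE test_csv_raw");
    // Final table with the same layout, populated with the '#' comment rows filtered out.
    stmt.executeQuery("CREATE TABLE test_csv (c1 STRING, c2 STRING, c3 STRING, c4 STRING)");
    stmt.executeQuery("INSERT OVERWRITE TABLE test_csv "
        + "SELECT c1, c2, c3, c4 FROM test_csv_raw WHERE c1 NOT LIKE '#%'");
    con.close();
  }
}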


________________________________
From: Manhee Jo <jo...@nttdocomo.com>
Reply-To: <hi...@hadoop.apache.org>
Date: Wed, 20 May 2009 23:05:03 -0700
To: <hi...@hadoop.apache.org>
Subject: Can Hive recognize commented out line in data files while loading?

Is it possible for hive to recognize commented rows in a file when it loads
a csv file?

For example, say contents of test.csv is,

#123
#Red, Brown, Black, Blue
3, AB, 5, 3
2, AA, 1, 4
...

In hive, how to ignore first two lines while loading?


Thanks,
Manhee




Can Hive recognize commented out line in data files while loading?

Posted by Manhee Jo <jo...@nttdocomo.com>.
Is it possible for hive to recognize commented rows in a file when it loads 
a csv file?

For example, say contents of test.csv is,

#123
#Red, Brown, Black, Blue
3, AB, 5, 3
2, AA, 1, 4
...

In hive, how to ignore first two lines while loading?


Thanks,
Manhee 



Re: hive hwi

Posted by Edward Capriolo <ed...@gmail.com>.
I have a RedHat/CentOS system. I find the best way is to add scripts
to /etc/profile.d

/etc/profile.d/hive.sh
---
PATH=$PATH:/opt/hive/bin
ANT_LIB=/opt/ant/lib

export PATH
export ANT_LIB
---
I did something like that here.

http://wiki.apache.org/hadoop/HiveDerbyServerMode

On Wed, May 20, 2009 at 9:04 PM, Manhee Jo <jo...@nttdocomo.com> wrote:
> Thank you Edward,
> It went well at last by copying the *.jar files to hive/lib.
> But it's really strange. I have all the directories in my path environment.
> In addition, I've tried many times to set it using my .tcshrc files,
> hadoop-env.sh etc.
> Excuse me for a naive question, but which is the best way to set path for
> hive and hadoop?
>
>
> Thanks,
> Manhee
>
> ----- Original Message ----- From: "Edward Capriolo" <ed...@gmail.com>
> To: <hi...@hadoop.apache.org>
> Sent: Tuesday, May 19, 2009 11:32 PM
> Subject: Re: hive hwi
>
>
>> THAT is the ANT_LIB path. Your ANT_LIB is not right. As an alternative
>> you can copy the files from ${ANT_HOME}/lib/*.jar to hive/lib
>>
>> But if you set ant lib correctly you should not need to copy.
>>
>> On Tue, May 19, 2009 at 4:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>
>>> Thank you.
>>>
>>> % ls $HIVE_HOME/lib/hive_hwi.war
>>> /usr/local/hive/build/dist/lib/hive_hwi.war
>>>
>>> It didn't work, but full path worked. Any advice?
>>> Now I've reached where Arijit was.
>>>
>>>
>>> http://mail-archives.apache.org/mod_mbox/hadoop-hive-user/200903.mbox/%3Ccbbf4b570903051044p75b81d5cx370b230d4a686dcd@mail.gmail.com%3E
>>>
>>> Any further help would be very appreciated.
>>>
>>>
>>> Thanks,
>>> Manhee
>>>
>>>
>>> ----- Original Message ----- From: "Edward Capriolo"
>>> <ed...@gmail.com>
>>> To: <hi...@hadoop.apache.org>
>>> Sent: Tuesday, May 19, 2009 1:16 PM
>>> Subject: Re: hive hwi
>>>
>>>
>>>> What does '${HIVE_HOME}/lib/hive_hwi.war' evaluate to? You can try
>>>> specifying the full path in the hive-site.xml file.
>>>>
>>>> On Tue, May 19, 2009 at 12:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>>>
>>>>> Thank you Edward,
>>>>> I've used apache-ant to build hive that I checked out from trunk.
>>>>> ant version is
>>>>>
>>>>> vm2:hive 22 % ant -version
>>>>> Apache Ant version 1.7.1 compiled on November 10 2008
>>>>>
>>>>> vm2:hive 25 % echo $ANT_LIB
>>>>> /usr/share/ant/lib:/usr/local/hive/build/dist/lib
>>>>>
>>>>> /usr/share/ant/lib has ant-*.jar
>>>>> /usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as
>>>>> well
>>>>> as
>>>>> other jar files.
>>>>>
>>>>> But still see the same errors while I was running Hadoop in pseudo
>>>>> distributed mode.
>>>>> So I need your help.
>>>>>
>>>>> Thanks,
>>>>> Manhee
>>>>>
>>>>>
>>>>> ----- Original Message ----- From: "Edward Capriolo"
>>>>> <ed...@gmail.com>
>>>>> To: <hi...@hadoop.apache.org>
>>>>> Sent: Tuesday, May 19, 2009 12:43 AM
>>>>> Subject: Re: hive hwi
>>>>>
>>>>>
>>>>> On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>>>>
>>>>>> Hi all. So, is this problem solved at last?
>>>>>> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
>>>>>> My ANT_LIB is /opt/ant/lib, where no files exist.
>>>>>> Do I need to set ANT_LIB? If so, which directory?
>>>>>>
>>>>>> In addition, I've tried 'ant deploy'
>>>>>> But target "deploy" does not exist in the project "hive."
>>>>>> Any help would be appreciated.
>>>>>>
>>>>>>
>>>>>> Thank you.
>>>>>> Manhee
>>>>>>
>>>>>>
>>>>>>
>>>>>> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
>>>>>> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
>>>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>>>> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
>>>>>> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
>>>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application
>>>>>> not
>>>>>> found ${HIVE_HOME}/lib/hive_hwi.war
>>>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration
>>>>>> error
>>>>>> on
>>>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>>> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
>>>>>> 0.0.0.0:9999
>>>>>> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
>>>>>> exception
>>>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>>> java.io.IOException: Problem starting HWI server
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>> at
>>>>>>
>>>>>>
>>>>>>
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>>> Caused by:
>>>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>>> ... 10 more
>>>>>>
>>>>>
>>>>> Currently you need apache ant installed to start the web interface.
>>>>> Currently Jetty requires ANT to load up the web application. Two
>>>>> JIRAs are open: one to include the required jars, the other to
>>>>> statically compile the web application, hopefully removing the
>>>>> requirement.
>>>>>
>>>>> For now please install apache-ant and point ANT_LIB to the location of
>>>>> the ant JAR files. Let me know if you need any help.
>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>>
>>
>
>
>

Re: hive hwi

Posted by Manhee Jo <jo...@nttdocomo.com>.
Thank you Edward,
It went well at last by copying the *.jar files to hive/lib.
But it's really strange. I have all the directories in my path environment.
In addition, I've tried many times to set it using my .tcshrc files, 
hadoop-env.sh etc.
Excuse me for a naive question, but which is the best way to set path for 
hive and hadoop?


Thanks,
Manhee

----- Original Message ----- 
From: "Edward Capriolo" <ed...@gmail.com>
To: <hi...@hadoop.apache.org>
Sent: Tuesday, May 19, 2009 11:32 PM
Subject: Re: hive hwi


> THAT is the ANT_LIB path. Your ANT_LIB is not right. As an alternative
> you can copy the files from ${ANT_HOME}/lib/*.jar to hive/lib
>
> But if you set ant lib correctly you should not need to copy.
>
> On Tue, May 19, 2009 at 4:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>> Thank you.
>>
>> % ls $HIVE_HOME/lib/hive_hwi.war
>> /usr/local/hive/build/dist/lib/hive_hwi.war
>>
>> It didn't work, but full path worked. Any advice?
>> Now I've reached where Arijit was.
>>
>> http://mail-archives.apache.org/mod_mbox/hadoop-hive-user/200903.mbox/%3Ccbbf4b570903051044p75b81d5cx370b230d4a686dcd@mail.gmail.com%3E
>>
>> Any further help would be very appreciated.
>>
>>
>> Thanks,
>> Manhee
>>
>>
>> ----- Original Message ----- From: "Edward Capriolo" 
>> <ed...@gmail.com>
>> To: <hi...@hadoop.apache.org>
>> Sent: Tuesday, May 19, 2009 1:16 PM
>> Subject: Re: hive hwi
>>
>>
>>> What does '${HIVE_HOME}/lib/hive_hwi.war' evaluate to? You can try
>>> specifying the full path in the hive-site.xml file.
>>>
>>> On Tue, May 19, 2009 at 12:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>>
>>>> Thank you Edward,
>>>> I've used apache-ant to build hive that I checked out from trunk.
>>>> ant version is
>>>>
>>>> vm2:hive 22 % ant -version
>>>> Apache Ant version 1.7.1 compiled on November 10 2008
>>>>
>>>> vm2:hive 25 % echo $ANT_LIB
>>>> /usr/share/ant/lib:/usr/local/hive/build/dist/lib
>>>>
>>>> /usr/share/ant/lib has ant-*.jar
>>>> /usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as 
>>>> well
>>>> as
>>>> other jar files.
>>>>
>>>> But still see the same errors while I was running Hadoop in pseudo
>>>> distributed mode.
>>>> So I need your help.
>>>>
>>>> Thanks,
>>>> Manhee
>>>>
>>>>
>>>> ----- Original Message ----- From: "Edward Capriolo"
>>>> <ed...@gmail.com>
>>>> To: <hi...@hadoop.apache.org>
>>>> Sent: Tuesday, May 19, 2009 12:43 AM
>>>> Subject: Re: hive hwi
>>>>
>>>>
>>>> On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>>>
>>>>> Hi all. So, is this problem solved at last?
>>>>> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
>>>>> My ANT_LIB is /opt/ant/lib, where no files exist.
>>>>> Do I need to set ANT_LIB? If so, which directory?
>>>>>
>>>>> In addition, I've tried 'ant deploy'
>>>>> But target "deploy" does not exist in the project "hive."
>>>>> Any help would be appreciated.
>>>>>
>>>>>
>>>>> Thank you.
>>>>> Manhee
>>>>>
>>>>>
>>>>>
>>>>> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
>>>>> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
>>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>>> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
>>>>> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
>>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application
>>>>> not
>>>>> found ${HIVE_HOME}/lib/hive_hwi.war
>>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration
>>>>> error
>>>>> on
>>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>>> at
>>>>>
>>>>>
>>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>>> at
>>>>>
>>>>>
>>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
>>>>> 0.0.0.0:9999
>>>>> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
>>>>> exception
>>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>>> at
>>>>>
>>>>>
>>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>>> at
>>>>>
>>>>>
>>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>> java.io.IOException: Problem starting HWI server
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>> at
>>>>>
>>>>>
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>>> Caused by:
>>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>>> ... 10 more
>>>>>
>>>>
>>>> Currently you need Apache Ant installed to start the web interface,
>>>> because Jetty requires Ant to load up the web application. Two
>>>> JIRAs are open: one to include the required jars, the other to
>>>> statically compile the web application, hopefully removing the
>>>> requirement.
>>>>
>>>> For now please install apache-ant and point ANT_LIB to the location of
>>>> the ant JAR files. Let me know if you need any help.
>>>>
>>>>
>>>>
>>>
>>
>>
>>
> 



Re: hive hwi

Posted by Edward Capriolo <ed...@gmail.com>.
THAT is the ANT_LIB path problem. Your ANT_LIB is not set correctly. As an
alternative, you can copy the jar files from ${ANT_HOME}/lib/*.jar into hive/lib.

But if you set ANT_LIB correctly you should not need to copy them.
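
Roughly, either of these should do it (a sketch only; /usr/share/ant/lib is
just the location from your earlier mail, so adjust it to wherever the ant
jars actually live):

  # option 1: point ANT_LIB at the single directory holding the ant jars
  export ANT_LIB=/usr/share/ant/lib

  # option 2: copy the ant jars into hive's lib directory instead
  cp ${ANT_HOME}/lib/*.jar ${HIVE_HOME}/lib/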

On Tue, May 19, 2009 at 4:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
> Thank you.
>
> % ls $HIVE_HOME/lib/hive_hwi.war
> /usr/local/hive/build/dist/lib/hive_hwi.war
>
> Using ${HIVE_HOME} in the config didn't work, but the full path did. Any advice?
> Now I've reached the point where Arijit was:
>
> http://mail-archives.apache.org/mod_mbox/hadoop-hive-user/200903.mbox/%3Ccbbf4b570903051044p75b81d5cx370b230d4a686dcd@mail.gmail.com%3E
>
> Any further help would be much appreciated.
>
>
> Thanks,
> Manhee
>
>
> ----- Original Message ----- From: "Edward Capriolo" <ed...@gmail.com>
> To: <hi...@hadoop.apache.org>
> Sent: Tuesday, May 19, 2009 1:16 PM
> Subject: Re: hive hwi
>
>
>> What does '${HIVE_HOME}/lib/hive_hwi.war' evaluate to? You can try
>> specifying the full path in the hive-site.xml file.
>>
>> On Tue, May 19, 2009 at 12:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>
>>> Thank you Edward,
>>> I've used apache-ant to build hive that I checked out from trunk.
>>> ant version is
>>>
>>> vm2:hive 22 % ant -version
>>> Apache Ant version 1.7.1 compiled on November 10 2008
>>>
>>> vm2:hive 25 % echo $ANT_LIB
>>> /usr/share/ant/lib:/usr/local/hive/build/dist/lib
>>>
>>> /usr/share/ant/lib has ant-*.jar
>>> /usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as well
>>> as
>>> other jar files.
>>>
>>> But still see the same errors while I was running Hadoop in pseudo
>>> distributed mode.
>>> So I need your help.
>>>
>>> Thanks,
>>> Manhee
>>>
>>>
>>> ----- Original Message ----- From: "Edward Capriolo"
>>> <ed...@gmail.com>
>>> To: <hi...@hadoop.apache.org>
>>> Sent: Tuesday, May 19, 2009 12:43 AM
>>> Subject: Re: hive hwi
>>>
>>>
>>> On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>>
>>>> Hi all. So, is this problem solved at last?
>>>> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
>>>> My ANT_LIB is /opt/ant/lib, where no files exist.
>>>> Do I need to set ANT_LIB? If so, which directory?
>>>>
>>>> In addition, I've tried 'ant deploy'
>>>> But target "deploy" does not exist in the project "hive."
>>>> Any help would be appreciated.
>>>>
>>>>
>>>> Thank you.
>>>> Manhee
>>>>
>>>>
>>>>
>>>> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
>>>> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
>>>> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application
>>>> not
>>>> found ${HIVE_HOME}/lib/hive_hwi.war
>>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration
>>>> error
>>>> on
>>>> ${HIVE_HOME}/lib/hive_hwi.war
>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>> at
>>>>
>>>>
>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>> at
>>>>
>>>>
>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>>>>
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>> at
>>>>
>>>>
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
>>>> 0.0.0.0:9999
>>>> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
>>>> exception
>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>>>>
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>> at
>>>>
>>>>
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>>> at
>>>>
>>>>
>>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>>> at
>>>>
>>>>
>>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>>>>
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>> at
>>>>
>>>>
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>> java.io.IOException: Problem starting HWI server
>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>>
>>>>
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>> at
>>>>
>>>>
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>>> Caused by:
>>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>>> at org.mortbay.util.Container.start(Container.java:72)
>>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>>> ... 10 more
>>>>
>>>
>>> Currently you need Apache Ant installed to start the web interface,
>>> because Jetty requires Ant to load up the web application. Two
>>> JIRAs are open: one to include the required jars, the other to
>>> statically compile the web application, hopefully removing the
>>> requirement.
>>>
>>> For now please install apache-ant and point ANT_LIB to the location of
>>> the ant JAR files. Let me know if you need any help.
>>>
>>>
>>>
>>
>
>
>

Re: hive hwi

Posted by Manhee Jo <jo...@nttdocomo.com>.
Thank you.

% ls $HIVE_HOME/lib/hive_hwi.war
/usr/local/hive/build/dist/lib/hive_hwi.war

Using ${HIVE_HOME} in the config didn't work, but the full path did. Any advice?
Now I've reached the point where Arijit was:

http://mail-archives.apache.org/mod_mbox/hadoop-hive-user/200903.mbox/%3Ccbbf4b570903051044p75b81d5cx370b230d4a686dcd@mail.gmail.com%3E

Any further help would be much appreciated.


Thanks,
Manhee


----- Original Message ----- 
From: "Edward Capriolo" <ed...@gmail.com>
To: <hi...@hadoop.apache.org>
Sent: Tuesday, May 19, 2009 1:16 PM
Subject: Re: hive hwi


> What does '${HIVE_HOME}/lib/hive_hwi.war' evaluate to? You can try
> specifying the full path in the hive-site.xml file.
>
> On Tue, May 19, 2009 at 12:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>> Thank you Edward,
>> I've used apache-ant to build hive that I checked out from trunk.
>> ant version is
>>
>> vm2:hive 22 % ant -version
>> Apache Ant version 1.7.1 compiled on November 10 2008
>>
>> vm2:hive 25 % echo $ANT_LIB
>> /usr/share/ant/lib:/usr/local/hive/build/dist/lib
>>
>> /usr/share/ant/lib has ant-*.jar
>> /usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as well 
>> as
>> other jar files.
>>
>> But still see the same errors while I was running Hadoop in pseudo
>> distributed mode.
>> So I need your help.
>>
>> Thanks,
>> Manhee
>>
>>
>> ----- Original Message ----- From: "Edward Capriolo" 
>> <ed...@gmail.com>
>> To: <hi...@hadoop.apache.org>
>> Sent: Tuesday, May 19, 2009 12:43 AM
>> Subject: Re: hive hwi
>>
>>
>> On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>>
>>> Hi all. So, is this problem solved at last?
>>> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
>>> My ANT_LIB is /opt/ant/lib, where no files exist.
>>> Do I need to set ANT_LIB? If so, which directory?
>>>
>>> In addition, I've tried 'ant deploy'
>>> But target "deploy" does not exist in the project "hive."
>>> Any help would be appreciated.
>>>
>>>
>>> Thank you.
>>> Manhee
>>>
>>>
>>>
>>> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
>>> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
>>> ${HIVE_HOME}/lib/hive_hwi.war
>>> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
>>> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application 
>>> not
>>> found ${HIVE_HOME}/lib/hive_hwi.war
>>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration 
>>> error
>>> on
>>> ${HIVE_HOME}/lib/hive_hwi.war
>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>> at
>>>
>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>> at
>>>
>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>>
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> at
>>>
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
>>> 0.0.0.0:9999
>>> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
>>> exception
>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>>
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> at
>>>
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>>> at
>>>
>>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>>> at
>>>
>>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>>
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> at
>>>
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>> java.io.IOException: Problem starting HWI server
>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at
>>>
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>> at
>>>
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>> at java.lang.reflect.Method.invoke(Method.java:597)
>>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>>> Caused by: 
>>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>>> ${HIVE_HOME}/lib/hive_hwi.war]
>>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>>> at org.mortbay.util.Container.start(Container.java:72)
>>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>>> ... 10 more
>>>
>>
>> Currently you need Apache Ant installed to start the web interface,
>> because Jetty requires Ant to load up the web application. Two
>> JIRAs are open: one to include the required jars, the other to
>> statically compile the web application, hopefully removing the
>> requirement.
>>
>> For now please install apache-ant and point ANT_LIB to the location of
>> the ant JAR files. Let me know if you need any help.
>>
>>
>>
> 



Re: hive hwi

Posted by Edward Capriolo <ed...@gmail.com>.
What does '${HIVE_HOME}/lib/hive_hwi.war' evaluate to? You can try
specifying the full path in the hive-site.xml file.
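
For example, something along these lines in hive-site.xml (a sketch only: the
property name below is an assumption, so check the hwi.* entries in your
hive-default.xml for the exact key, and use whatever absolute path the war
file really lives at):

  <property>
    <name>hive.hwi.war.file</name>
    <value>/usr/local/hive/build/dist/lib/hive_hwi.war</value>
  </property>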

On Tue, May 19, 2009 at 12:05 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
> Thank you Edward,
> I've used apache-ant to build hive that I checked out from trunk.
> ant version is
>
> vm2:hive 22 % ant -version
> Apache Ant version 1.7.1 compiled on November 10 2008
>
> vm2:hive 25 % echo $ANT_LIB
> /usr/share/ant/lib:/usr/local/hive/build/dist/lib
>
> /usr/share/ant/lib has ant-*.jar
> /usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as well as
> other jar files.
>
> But still see the same errors while I was running Hadoop in pseudo
> distributed mode.
> So I need your help.
>
> Thanks,
> Manhee
>
>
> ----- Original Message ----- From: "Edward Capriolo" <ed...@gmail.com>
> To: <hi...@hadoop.apache.org>
> Sent: Tuesday, May 19, 2009 12:43 AM
> Subject: Re: hive hwi
>
>
> On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
>>
>> Hi all. So, is this problem solved at last?
>> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
>> My ANT_LIB is /opt/ant/lib, where no files exist.
>> Do I need to set ANT_LIB? If so, which directory?
>>
>> In addition, I've tried 'ant deploy'
>> But target "deploy" does not exist in the project "hive."
>> Any help would be appreciated.
>>
>>
>> Thank you.
>> Manhee
>>
>>
>>
>> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
>> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
>> ${HIVE_HOME}/lib/hive_hwi.war
>> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
>> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application not
>> found ${HIVE_HOME}/lib/hive_hwi.war
>> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration error
>> on
>> ${HIVE_HOME}/lib/hive_hwi.war
>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>> at
>>
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>> at
>>
>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> at java.lang.reflect.Method.invoke(Method.java:597)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
>> 0.0.0.0:9999
>> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
>> exception
>> org.mortbay.util.MultiException[java.io.FileNotFoundException:
>> ${HIVE_HOME}/lib/hive_hwi.war]
>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> at java.lang.reflect.Method.invoke(Method.java:597)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>> at
>>
>> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>> at
>>
>> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> at java.lang.reflect.Method.invoke(Method.java:597)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> java.io.IOException: Problem starting HWI server
>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at
>>
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> at
>>
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> at java.lang.reflect.Method.invoke(Method.java:597)
>> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> Caused by: org.mortbay.util.MultiException[java.io.FileNotFoundException:
>> ${HIVE_HOME}/lib/hive_hwi.war]
>> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>> at org.mortbay.util.Container.start(Container.java:72)
>> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>> ... 10 more
>>
>
> Currently you need Apache Ant installed to start the web interface,
> because Jetty requires Ant to load up the web application. Two
> JIRAs are open: one to include the required jars, the other to
> statically compile the web application, hopefully removing the
> requirement.
>
> For now please install apache-ant and point ANT_LIB to the location of
> the ant JAR files. Let me know if you need any help.
>
>
>

Re: hive hwi

Posted by Manhee Jo <jo...@nttdocomo.com>.
Thank you Edward,
I've used apache-ant to build the hive tree that I checked out from trunk.
The ant version is:

vm2:hive 22 % ant -version
Apache Ant version 1.7.1 compiled on November 10 2008

vm2:hive 25 % echo $ANT_LIB
/usr/share/ant/lib:/usr/local/hive/build/dist/lib

/usr/share/ant/lib has ant-*.jar
/usr/local/hive/build/dist/lib has hive_hwi.jar and hive_hwi.war as well as 
other jar files.

But I still see the same errors while running Hadoop in pseudo-distributed
mode.
So I need your help.

Thanks,
Manhee


----- Original Message ----- 
From: "Edward Capriolo" <ed...@gmail.com>
To: <hi...@hadoop.apache.org>
Sent: Tuesday, May 19, 2009 12:43 AM
Subject: Re: hive hwi


On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
> Hi all. So, is this problem solved at last?
> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
> My ANT_LIB is /opt/ant/lib, where no files exist.
> Do I need to set ANT_LIB? If so, which directory?
>
> In addition, I've tried 'ant deploy'
> But target "deploy" does not exist in the project "hive."
> Any help would be appreciated.
>
>
> Thank you.
> Manhee
>
>
>
> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
> ${HIVE_HOME}/lib/hive_hwi.war
> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application not
> found ${HIVE_HOME}/lib/hive_hwi.war
> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration error 
> on
> ${HIVE_HOME}/lib/hive_hwi.war
> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
> at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
> at
> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
> 0.0.0.0:9999
> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
> exception
> org.mortbay.util.MultiException[java.io.FileNotFoundException:
> ${HIVE_HOME}/lib/hive_hwi.war]
> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
> at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
> at
> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> java.io.IOException: Problem starting HWI server
> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
> at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> Caused by: org.mortbay.util.MultiException[java.io.FileNotFoundException:
> ${HIVE_HOME}/lib/hive_hwi.war]
> at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
> at org.mortbay.util.Container.start(Container.java:72)
> at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
> ... 10 more
>

Currently you need Apache Ant installed to start the web interface,
because Jetty requires Ant to load up the web application. Two
JIRAs are open: one to include the required jars, the other to
statically compile the web application, hopefully removing the
requirement.

For now please install apache-ant and point ANT_LIB to the location of
the ant JAR files. Let me know if you need any help.



Re: hive hwi

Posted by Edward Capriolo <ed...@gmail.com>.
On Mon, May 18, 2009 at 4:33 AM, Manhee Jo <jo...@nttdocomo.com> wrote:
> Hi all. So, is this problem solved at last?
> In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
> My ANT_LIB is /opt/ant/lib, where no files exist.
> Do I need to set ANT_LIB? If so, which directory?
>
> In addition, I've tried 'ant deploy'
> But target "deploy" does not exist in the project  "hive."
> Any help would be appreciated.
>
>
> Thank you.
> Manhee
>
>
>
> 09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
> 09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at
> ${HIVE_HOME}/lib/hive_hwi.war
> 09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
> 09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application not
> found ${HIVE_HOME}/lib/hive_hwi.war
> 09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration error on
> ${HIVE_HOME}/lib/hive_hwi.war
> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>  at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>  at
> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>  at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>  at java.lang.reflect.Method.invoke(Method.java:597)
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>  at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>  at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> 09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on
> 0.0.0.0:9999
> 09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused
> exception
> org.mortbay.util.MultiException[java.io.FileNotFoundException:
> ${HIVE_HOME}/lib/hive_hwi.war]
>  at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>  at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>  at java.lang.reflect.Method.invoke(Method.java:597)
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>  at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>  at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
>  at
> org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
>  at
> org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>  at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>  at java.lang.reflect.Method.invoke(Method.java:597)
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>  at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>  at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> java.io.IOException: Problem starting HWI server
>  at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
>  at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>  at java.lang.reflect.Method.invoke(Method.java:597)
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>  at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>  at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
> Caused by: org.mortbay.util.MultiException[java.io.FileNotFoundException:
> ${HIVE_HOME}/lib/hive_hwi.war]
>  at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
>  at org.mortbay.util.Container.start(Container.java:72)
>  at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
>  ... 10 more
>

Currently you need Apache Ant installed to start the web interface,
because Jetty requires Ant to load up the web application. Two
JIRAs are open: one to include the required jars, the other to
statically compile the web application, hopefully removing the
requirement.

For now please install apache-ant and point ANT_LIB to the location of
the ant JAR files. Let me know if you need any help.

hive hwi

Posted by Manhee Jo <jo...@nttdocomo.com>.
Hi all. So, is this problem solved at last?
In my environment, I have hive_hwi.war in ${HIVE_HOME}/lib.
My ANT_LIB is /opt/ant/lib, where no files exist.
Do I need to set ANT_LIB? If so, which directory?

In addition, I've tried 'ant deploy', but the target "deploy" does not exist
in the project "hive".
Any help would be appreciated.


Thank you.
Manhee



09/05/18 17:26:46 INFO hwi.HWIServer: HWI is starting up
09/05/18 17:26:46 FATAL hwi.HWIServer: HWI WAR file not found at 
${HIVE_HOME}/lib/hive_hwi.war
09/05/18 17:26:46 INFO http.HttpServer: Version Jetty/5.1.4
09/05/18 17:26:46 INFO util.Credential: Checking Resource aliases
09/05/18 17:26:46 WARN servlet.WebApplicationContext: Web application not 
found ${HIVE_HOME}/lib/hive_hwi.war
09/05/18 17:26:46 WARN servlet.WebApplicationContext: Configuration error on 
${HIVE_HOME}/lib/hive_hwi.war
java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
 at 
org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
 at 
org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
 at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
 at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
 at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
09/05/18 17:26:46 INFO http.SocketListener: Started SocketListener on 
0.0.0.0:9999
09/05/18 17:26:46 ERROR hwi.HWIServer: Parsing hwi.listen.port caused 
exception
org.mortbay.util.MultiException[java.io.FileNotFoundException: 
${HIVE_HOME}/lib/hive_hwi.war]
 at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
 at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
 at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
 at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
java.io.FileNotFoundException: ${HIVE_HOME}/lib/hive_hwi.war
 at 
org.mortbay.jetty.servlet.WebApplicationContext.resolveWebApp(WebApplicationContext.java:266)
 at 
org.mortbay.jetty.servlet.WebApplicationContext.doStart(WebApplicationContext.java:449)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.mortbay.http.HttpServer.doStart(HttpServer.java:753)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
 at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
 at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
 at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
java.io.IOException: Problem starting HWI server
 at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:97)
 at org.apache.hadoop.hive.hwi.HWIServer.main(HWIServer.java:115)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
 at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
 at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
 at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
 at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
Caused by: org.mortbay.util.MultiException[java.io.FileNotFoundException: 
${HIVE_HOME}/lib/hive_hwi.war]
 at org.mortbay.http.HttpServer.doStart(HttpServer.java:731)
 at org.mortbay.util.Container.start(Container.java:72)
 at org.apache.hadoop.hive.hwi.HWIServer.start(HWIServer.java:86)
 ... 10 more

Re: hive jdbc client usage?

Posted by Raghu Murthy <rm...@facebook.com>.
I have created a standalone program for hive jdbc at:
http://wiki.apache.org/hadoop/Hive/HiveClient#head-5b27b3a8f9f322945734f470d1ae58f8afeaa0b4

Let me know if it works.
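
In case that wiki page is hard to reach, a very rough sketch of what such a
standalone client looks like follows. This is not the exact wiki program; the
driver class name, the empty username/password, and the table name "foo" are
assumptions for this version of the driver:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcExample {
  public static void main(String[] args) throws Exception {
    // Register the Hive JDBC driver (class name assumed for this Hive version).
    Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

    // Connect to a standalone hive server started with HIVE_PORT=10000.
    Connection con =
        DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
    Statement stmt = con.createStatement();

    // Run a regular query; metadata-only calls (show/describe) were reported
    // as problematic over JDBC earlier in this thread.
    ResultSet rs = stmt.executeQuery("select * from foo");
    while (rs.next()) {
      System.out.println(rs.getString(1));
    }
  }
}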

raghu

On 5/14/09 1:33 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:

> I started the server with
> aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000 ./hive
> --service hiveserver
> It appeared to start correctly.
> 
> Then ran the test using the ant command-line you gave me. It connected to the
> hiveserver (some output appeared there, including many SqlExceptions regarding
> indices that already exist), but the test fails:
> 
> test:
>     [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
>     [junit] Hive history
> file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaro
> n_200905141328_716976076.txt
>     [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
>     [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED
> 
> BUILD FAILED
> /home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error occurred
> while executing this line:
> /home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!
> 
> I've attached the test log.
> - Aaron
> 
> On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com> wrote:
>> Ok, one more test. Can you apply the attached patch and then run the
>> following?
>> 
>> 1. rebuild
>> 2. from dist/bin, run hive server on localhost port 10000
>> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
>> 
>> Does the test succeed?
>> 
>> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>> 
>>>> I can in fact run the hive cli. I created a table named foo and can
>>> describe
>>>> it, select from it, etc.
>>>> 
>>>> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as well.
>>>> - Aaron
>>>> 
>>>> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com> wrote:
>>>>>> Are you able to run the hive cli from the same installation? There are
>>>>>> currently some issues while running metadata-only calls (show, describe)
>>>>>> >>> via
>>>>>> JDBC. Regular queries should be fine though.
>>>>>> 
>>>>>> 
>>>>>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>>>>>> 
>>>>>>>>>> Hi all,
>>>>>>>>>> 
>>>>>>>>>> I've been trying to use the Hive JDBC client today with some
>>>>>> frustration.
>>>>>>>>>>>> My
>>>>>>>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
>>>>>>>>>> 
>>>>>>>>>> If I start the Hive server with HIVE_PORT=10000 hive --service
>>>>>> hiveserver,
>>>>>>>>>>>> the
>>>>>>>>>> following happens when I connect to
>>>>>> jdbc:hive://localhost:10000/default:
>>>>>>>>>> java.sql.SQLException: Method not supported
>>>>>>>>>> 
>>>>>>>>>> If instead I attempt to connect to jdbc:hive:// (without the
>>>>>> standalone
>>>>>>>>>> hiveserver started), I get:
>>>>>>>>>> java.sql.SQLException:
>>>>>> MetaException(message:hive.metastore.warehouse.dir
>>>>>>>>>>>> is
>>>>>>>>>> not set in the config or blank)
>>>>>>>>>> 
>>>>>>>>>> I'm confused where I should set the hive.metastore.warehouse.dir
>>>>>> property.
>>>>>>>>>> I've run this from a directory containing a valid hive-site.xml; this
>>>>>>>>>> directory is named "conf/", so I also tried running my program in
>>>>>>>>>> that
>>>>>>>>>> directory's parent, thinking it may look for conf/hive-default.xml
>>>>>>>>>> and
>>>>>>>>>> conf/hive-site.xml.  How do I set the configuration files that will
>>>>>>>>>> be
>>>>>>>> loaded
>>>>>>>>>> inside the call to DriverManager.getConnection()? And in the case of
>>>>>>>>>> the
>>>>>>>>>> standalone server, does anyone have any insight into why I'd get
>>>>>> "method >>
>>>>>>>>>> not
>>>>>>>>>> supported" ?
>>>>>>>>>> 
>>>>>>>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't
>>>>>> know if
>>>>>>>>>> launching a program in this way would mess up Hive's config paths,
>>>>>>>>>> etc.
>>>>>>>> This
>>>>>>>>>> is Hadoop 0.18.3, Hive 0.3.0.
>>>>>>>>>> 
>>>>>>>>>> Thanks,
>>>>>>>>>> - Aaron
>>>>>> 
>>>> 
>> 
> 


Re: hive jdbc client usage?

Posted by Aaron Kimball <aa...@cloudera.com>.
I started the server with:
aaron@jargon:~/src/ext/svn/hive-0.3.0/build/dist/bin$ HIVE_PORT=10000 ./hive --service hiveserver
It appeared to start correctly.

Then I ran the test using the ant command line you gave me. It connected to
the hiveserver (some output appeared there, including many SQLExceptions
about indices that already exist), but the test failed:

test:
    [junit] Running org.apache.hadoop.hive.jdbc.TestJdbcDriver
    [junit] Hive history
file=/home/aaron/src/ext/svn/hive-0.3.0/jdbc/../build/ql/tmp/hive_job_log_aaron_200905141328_716976076.txt
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 1.836 sec
    [junit] Test org.apache.hadoop.hive.jdbc.TestJdbcDriver FAILED

BUILD FAILED
/home/aaron/src/ext/svn/hive-0.3.0/build.xml:166: The following error
occurred while executing this line:
/home/aaron/src/ext/svn/hive-0.3.0/build-common.xml:269: Tests failed!

I've attached the test log.
- Aaron

On Wed, May 13, 2009 at 8:30 PM, Raghu Murthy <rm...@facebook.com> wrote:

> Ok, one more test. Can you apply the attached patch and then run the
> following?
>
> 1. rebuild
> 2. from dist/bin, run hive server on localhost port 10000
> 3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true
>
> Does the test succeed?
>
> On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>
> > I can in fact run the hive cli. I created a table named foo and can
> describe
> > it, select from it, etc.
> >
> > I also tried to run 'SELECT * FROM foo' via JDBC and that failed as well.
> > - Aaron
> >
> > On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com>
> wrote:
> >> Are you able to run the hive cli from the same installation? There are
> >> currently some issues while running metadata-only calls (show, describe)
> via
> >> JDBC. Regular queries should be fine though.
> >>
> >>
> >> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
> >>
> >>>> Hi all,
> >>>>
> >>>> I've been trying to use the Hive JDBC client today with some
> frustration.
> >>>> >> My
> >>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
> >>>>
> >>>> If I start the Hive server with HIVE_PORT=10000 hive --service
> hiveserver,
> >>>> >> the
> >>>> following happens when I connect to
> jdbc:hive://localhost:10000/default:
> >>>> java.sql.SQLException: Method not supported
> >>>>
> >>>> If instead I attempt to connect to jdbc:hive:// (without the
> standalone
> >>>> hiveserver started), I get:
> >>>> java.sql.SQLException:
> MetaException(message:hive.metastore.warehouse.dir
> >>>> >> is
> >>>> not set in the config or blank)
> >>>>
> >>>> I'm confused where I should set the hive.metastore.warehouse.dir
> property.
> >>>> I've run this from a directory containing a valid hive-site.xml; this
> >>>> directory is named "conf/", so I also tried running my program in that
> >>>> directory's parent, thinking it may look for conf/hive-default.xml and
> >>>> conf/hive-site.xml.  How do I set the configuration files that will be
> >>> loaded
> >>>> inside the call to DriverManager.getConnection()? And in the case of
> the
> >>>> standalone server, does anyone have any insight into why I'd get
> "method >>
> >>>> not
> >>>> supported" ?
> >>>>
> >>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't know
> if
> >>>> launching a program in this way would mess up Hive's config paths,
> etc.
> >>> This
> >>>> is Hadoop 0.18.3, Hive 0.3.0.
> >>>>
> >>>> Thanks,
> >>>> - Aaron
> >>
> >
>
>

Re: hive jdbc client usage?

Posted by Raghu Murthy <rm...@facebook.com>.
Ok, one more test. Can you apply the attached patch and then run the
following?

1. rebuild
2. from dist/bin, run hive server on localhost port 10000
3. from trunk, run ant test -Dtestcase=TestJdbcDriver -Dstandalone=true

Does the test succeed?
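
Concretely, those three steps look roughly like this (a sketch; 'ant package'
is assumed to be how you normally rebuild the dist, so adjust as needed):

  # 1. rebuild from the top of the tree
  cd hive-trunk && ant package

  # 2. start the hive server from the freshly built dist
  cd build/dist/bin && HIVE_PORT=10000 ./hive --service hiveserver &

  # 3. back at the top of the tree, run the standalone jdbc test
  cd ../../.. && ant test -Dtestcase=TestJdbcDriver -Dstandalone=true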

On 5/13/09 4:13 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:

> I can in fact run the hive cli. I created a table named foo and can describe
> it, select from it, etc.
> 
> I also tried to run 'SELECT * FROM foo' via JDBC and that failed as well.
> - Aaron
> 
> On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com> wrote:
>> Are you able to run the hive cli from the same installation? There are
>> currently some issues while running metadata-only calls (show, describe) via
>> JDBC. Regular queries should be fine though.
>> 
>> 
>> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>> 
>>>> Hi all,
>>>> 
>>>> I've been trying to use the Hive JDBC client today with some frustration.
>>>> >> My
>>>> goal was to execute a simple "SHOW TABLES" statement in Hive.
>>>> 
>>>> If I start the Hive server with HIVE_PORT=10000 hive --service hiveserver,
>>>> >> the
>>>> following happens when I connect to jdbc:hive://localhost:10000/default:
>>>> java.sql.SQLException: Method not supported
>>>> 
>>>> If instead I attempt to connect to jdbc:hive:// (without the standalone
>>>> hiveserver started), I get:
>>>> java.sql.SQLException: MetaException(message:hive.metastore.warehouse.dir
>>>> >> is
>>>> not set in the config or blank)
>>>> 
>>>> I'm confused where I should set the hive.metastore.warehouse.dir property.
>>>> I've run this from a directory containing a valid hive-site.xml; this
>>>> directory is named "conf/", so I also tried running my program in that
>>>> directory's parent, thinking it may look for conf/hive-default.xml and
>>>> conf/hive-site.xml.  How do I set the configuration files that will be
>>> loaded
>>>> inside the call to DriverManager.getConnection()? And in the case of the
>>>> standalone server, does anyone have any insight into why I'd get "method >>
>>>> not
>>>> supported" ?
>>>> 
>>>> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't know if
>>>> launching a program in this way would mess up Hive's config paths, etc.
>>> This
>>>> is Hadoop 0.18.3, Hive 0.3.0.
>>>> 
>>>> Thanks,
>>>> - Aaron
>> 
> 


Re: hive jdbc client usage?

Posted by Aaron Kimball <aa...@cloudera.com>.
I can in fact run the hive cli. I created a table named foo and can describe
it, select from it, etc.

I also tried to run 'SELECT * FROM foo' via JDBC and that failed as well.
- Aaron

On Wed, May 13, 2009 at 4:02 PM, Raghu Murthy <rm...@facebook.com> wrote:

> Are you able to run the hive cli from the same installation? There are
> currently some issues while running metadata-only calls (show, describe)
> via
> JDBC. Regular queries should be fine though.
>
>
> On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:
>
> > Hi all,
> >
> > I've been trying to use the Hive JDBC client today with some frustration.
> My
> > goal was to execute a simple "SHOW TABLES" statement in Hive.
> >
> > If I start the Hive server with HIVE_PORT=10000 hive --service
> hiveserver, the
> > following happens when I connect to jdbc:hive://localhost:10000/default:
> > java.sql.SQLException: Method not supported
> >
> > If instead I attempt to connect to jdbc:hive:// (without the standalone
> > hiveserver started), I get:
> > java.sql.SQLException: MetaException(message:hive.metastore.warehouse.dir
> is
> > not set in the config or blank)
> >
> > I'm confused where I should set the hive.metastore.warehouse.dir
> property.
> > I've run this from a directory containing a valid hive-site.xml; this
> > directory is named "conf/", so I also tried running my program in that
> > directory's parent, thinking it may look for conf/hive-default.xml and
> > conf/hive-site.xml.  How do I set the configuration files that will be
> loaded
> > inside the call to DriverManager.getConnection()? And in the case of the
> > standalone server, does anyone have any insight into why I'd get "method
> not
> > supported" ?
> >
> > FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't know if
> > launching a program in this way would mess up Hive's config paths, etc.
> This
> > is Hadoop 0.18.3, Hive 0.3.0.
> >
> > Thanks,
> > - Aaron
>
>

Re: hive jdbc client usage?

Posted by Raghu Murthy <rm...@facebook.com>.
Are you able to run the hive cli from the same installation? There are
currently some issues while running metadata-only calls (show, describe) via
JDBC. Regular queries should be fine though.
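
In code terms the distinction is roughly this (hypothetical Statement object
and table name):

  stmt.executeQuery("SHOW TABLES");        // metadata-only call: currently problematic via JDBC
  stmt.executeQuery("SELECT * FROM foo");  // regular query: expected to work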


On 5/13/09 3:59 PM, "Aaron Kimball" <aa...@cloudera.com> wrote:

> Hi all,
> 
> I've been trying to use the Hive JDBC client today with some frustration. My
> goal was to execute a simple "SHOW TABLES" statement in Hive.
> 
> If I start the Hive server with HIVE_PORT=10000 hive --service hiveserver, the
> following happens when I connect to jdbc:hive://localhost:10000/default:
> java.sql.SQLException: Method not supported
> 
> If instead I attempt to connect to jdbc:hive:// (without the standalone
> hiveserver started), I get:
> java.sql.SQLException: MetaException(message:hive.metastore.warehouse.dir is
> not set in the config or blank)
> 
> I'm confused where I should set the hive.metastore.warehouse.dir property.
> I've run this from a directory containing a valid hive-site.xml; this
> directory is named "conf/", so I also tried running my program in that
> directory's parent, thinking it may look for conf/hive-default.xml and
> conf/hive-site.xml.  How do I set the configuration files that will be loaded
> inside the call to DriverManager.getConnection()? And in the case of the
> standalone server, does anyone have any insight into why I'd get "method not
> supported" ?
> 
> FWIW, the program I ran was invoked via 'hadoop jar ...'; I don't know if
> launching a program in this way would mess up Hive's config paths, etc. This
> is Hadoop 0.18.3, Hive 0.3.0.
> 
> Thanks,
> - Aaron