Posted to user@hive.apache.org by Zheng Shao <zs...@gmail.com> on 2010/02/05 08:11:38 UTC

Re: Hive Installation Problem

Try this:

cd ~/.ant/cache/hadoop/core/sources
wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz


Zheng

On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
> Hello,
>
> I am new to Hadoop and am trying to install Hive. We have the following setup on our side:
>
> OS - Ubuntu 9.10
> Hadoop - 0.20.1
> Hive version tried - 0.4.0
>
> Hadoop is installed and working fine. When we were installing Hive, I got an error that it couldn't resolve the dependencies. I changed the shims build and properties XML to make the dependencies look for Hadoop 0.20.1, but now when I call the ant script I get the following error:
>
> ivy-retrieve-hadoop-source:
> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ :
> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working
> [ivy:retrieve]  confs: [default]
> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
>        ---------------------------------------------------------------------
>        |                  |            modules            ||   artifacts   |
>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>        ---------------------------------------------------------------------
>        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>        ---------------------------------------------------------------------
> [ivy:retrieve]
> [ivy:retrieve] :: problems summary ::
> [ivy:retrieve] :::: WARNINGS
> [ivy:retrieve]          module not found: hadoop#core;0.20.1
> [ivy:retrieve]  ==== hadoop-source: tried
> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> [ivy:retrieve]    http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> [ivy:retrieve]  ==== apache-snapshot: tried
> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
> [ivy:retrieve]  ==== maven2: tried
> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> [ivy:retrieve] :::: ERRORS
> [ivy:retrieve]  Server access Error: Connection timed out url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
> [ivy:retrieve]
> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>
> BUILD FAILED
> /master/hive/build.xml:148: The following error occurred while executing this line:
> /master/hive/build.xml:93: The following error occurred while executing this line:
> /master/hive/shims/build.xml:64: The following error occurred while executing this line:
> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>        resolve failed - see output for details
>
> Total time: 15 minutes 55 seconds
>
>
> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the user's ant cache, but the same error is repeated. I am stuck and unable to install it.
>
> Any help on the above will be greatly appreciated.
>
> Babu
>
>
> DISCLAIMER: The information in this message is confidential and may be legally privileged. It is intended solely for the addressee. Access to this message by anyone else is unauthorized. If you are not the intended recipient, any disclosure, copying, or distribution of the message, or any action or omission taken by you in reliance on it, is prohibited and may be unlawful. Please immediately contact the sender if you have received this message in error. Further, this e-mail may contain viruses and all reasonable precaution to minimize the risk arising there from is taken by OnMobile. OnMobile is not liable for any damage sustained by you as a result of any virus in this e-mail. All applicable virus checks should be carried out by you before opening this e-mail or any attachment thereto.
> Thank you - OnMobile Global Limited.
>



-- 
Yours,
Zheng

heads up on ivy upgrade

Posted by John Sichi <js...@facebook.com>.
Hi all,

Zheng just committed my patch for HIVE-1120, which upgrades Ivy from 2.0 to 2.1 and refines the new "offline" mode for the Hive build.

After updating your sandbox with this patch, you'll need to delete your IVY_HOME directory (typically ~/.ant unless you have set it explicitly); otherwise you'll get errors the next time you run "ant package". See the JIRA issue for an example of the error message.

Unfortunately, this means Ivy will have to re-download the big Hadoop dependencies on your next build, and as a number of people have reported recently, that download currently seems a little flaky. This patch won't improve that situation (from what I've seen, the flakiness comes from the source repositories), but it shouldn't make it any worse. Once you successfully re-download, you can add ANT_ARGS="-Doffline=true" to your shell environment and work disconnected after that.

For the flakiness, I'm going to take a look at ivysettings.xml to see if we can improve the repository situation via mirroring.
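In concrete terms, the workflow above might look like the following POSIX shell sketch. The cache path assumes the default IVY_HOME of ~/.ant, and the ant invocations are left commented out because they need a Hive checkout to run:

```shell
#!/bin/sh
# Resolve the Ivy cache location the same way the Hive build does:
# $IVY_HOME if set, otherwise ~/.ant.
IVY_HOME="${IVY_HOME:-$HOME/.ant}"

# 1. One-time cleanup after picking up HIVE-1120: delete the old cache
#    so the upgraded Ivy 2.1 starts fresh.
rm -rf "$IVY_HOME"

# 2. The next online build re-downloads the Hadoop dependencies:
#      ant package

# 3. Once that succeeds, later builds can run disconnected:
export ANT_ARGS="-Doffline=true"
#      ant package    # resolves purely from the local cache
```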

JVS



Re: Hive Installation Problem

Posted by Zheng Shao <zs...@gmail.com>.
Did you ever set HADOOP_CLASSPATH or CLASSPATH?
You might want to unset those variables and try again.
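For reference, that check might look like this in a POSIX shell (the final bin/hive call is commented out because it needs a Hive install to run):

```shell
#!/bin/sh
# Show whether either variable is set in the current environment.
env | grep -E '^(HADOOP_CLASSPATH|CLASSPATH)=' || echo "neither variable is set"

# Clear both for this shell session, then retry Hive.
unset HADOOP_CLASSPATH CLASSPATH
# bin/hive
```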


Zheng

On Mon, Feb 8, 2010 at 3:55 AM, Vidyasagar Venkata Nallapati
<vi...@onmobile.com> wrote:
> Hi Edward,
>
> Even when running the 0.4.1 build, we are getting:
>
> phoenix@ph5:/master/hadoop/hive-0.4.1-bin$ bin/hive
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
>        at java.lang.Class.forName0(Native Method)
>        at java.lang.Class.forName(Class.java:247)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>        ... 3 more
>
> Regards
> Vidyasagar N V
>
> -----Original Message-----
> From: Edward Capriolo [mailto:edlinuxguru@gmail.com]
> Sent: Sunday, February 07, 2010 9:13 PM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
> On Sun, Feb 7, 2010 at 8:51 AM, Vidyasagar Venkata Nallapati
> <vi...@onmobile.com> wrote:
>> Hi,
>>
>> I am getting the problem while downloading the .pom file from
>>
>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
>> ERROR: Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
>>
>> Has this repository changed?
>>
>> Regards
>> Vidyasagar N V
>>
>> -----Original Message-----
>> From: John Sichi [mailto:jsichi@facebook.com]
>> Sent: Saturday, February 06, 2010 7:39 AM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>> By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:
>>
>> <property environment="env"/>
>>
>> first.
>>
>> I'll log a JIRA issue for it, but there seem to be other problems with it even after I fix that, since the build is currently installing Ivy under build/ivy rather than under ${ivy.home}; nothing else in build-common.xml references ivy.home.
>>
>> JVS
>>
>> On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:
>>
>>> Hi guys,
>>>
>>> Can you try making the following directory layout match mine?
>>> Once this is done, remove the "build" directory, and run "ant package".
>>>
>>> Does this solve the problem?
>>>
>>>
>>>
>>> [zshao@dev ~/.ant] ls -lR
>>> .:
>>> total 3896
>>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
>>> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
>>> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
>>> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
>>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>>>
>>> ./apache-ivy-2.0.0-rc2:
>>> total 880
>>> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>>>
>>> ./cache:
>>> total 4
>>> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>>>
>>> ./cache/hadoop:
>>> total 4
>>> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>>>
>>> ./cache/hadoop/core:
>>> total 4
>>> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>>>
>>> ./cache/hadoop/core/sources:
>>> total 127436
>>> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
>>> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
>>> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
>>> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>>>
>>> ./lib:
>>> total 880
>>> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
>>>
>>>
>>> Zheng
>>>
>>> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
>>> <vi...@onmobile.com> wrote:
>>>> Hi ,
>>>>
>>>>
>>>>
>>>> We are still getting the problem
>>>>
>>>>
>>>>
>>>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>>>
>>>> Overriding previous definition of property "ivy.version"
>>>>
>>>> [ivy:retrieve] using ivy parser to parse
>>>> file:/master/hadoop/hive/shims/ivy.xml
>>>>
>>>> [ivy:retrieve] :: resolving dependencies ::
>>>> org.apache.hadoop.hive#shims;working@ph1
>>>>
>>>> [ivy:retrieve]  confs: [default]
>>>>
>>>> [ivy:retrieve]  validate = true
>>>>
>>>> [ivy:retrieve]  refresh = false
>>>>
>>>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>>>
>>>> [ivy:retrieve] == resolving dependencies for
>>>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>>>
>>>> [ivy:retrieve] == resolving dependencies
>>>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>>>
>>>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>>>> {*=[*]}
>>>>
>>>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>>>> hadoop#core;0.20.1
>>>>
>>>> [ivy:retrieve]          tried
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>>
>>>>
>>>>
>>>> And the .pom for this is not getting copied; please suggest something on this.
>>>>
>>>>
>>>>
>>>> Regards
>>>>
>>>> Vidyasagar N V
>>>>
>>>>
>>>>
>>>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>>>> Sent: Friday, February 05, 2010 4:59 PM
>>>>
>>>> To: hive-user@hadoop.apache.org
>>>> Subject: RE: Hive Installation Problem
>>>>
>>>>
>>>>
>>>> No, I don't have the variable defined. Is there anything else I should check?
>>>> Is this happening because I am trying Hadoop 0.20.1?
>>>>
>>>>
>>>>
>>>> Babu
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> From: Carl Steinbach [mailto:carl@cloudera.com]
>>>> Sent: Friday, February 05, 2010 3:07 PM
>>>> To: hive-user@hadoop.apache.org
>>>> Subject: Re: Hive Installation Problem
>>>>
>>>>
>>>>
>>>> Hi Babu,
>>>>
>>>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>>>> environment variable IVY_HOME
>>>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>>>> this environment
>>>> variable set to a value different than ~/.ant?
>>>>
>>>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>>>
>>>> I have tried the same, but the installation still gives the same error. I
>>>> don't know if it is looking in the cache. Can we change ivysettings.xml so
>>>> that it resolves the file from the file system rather than through a URL?
>>>>
>>>> Babu
>>>>
>>>> -----Original Message-----
>>>> From: Zheng Shao [mailto:zshao9@gmail.com]
>>>> Sent: Friday, February 05, 2010 12:47 PM
>>>> To: hive-user@hadoop.apache.org
>>>> Subject: Re: Hive Installation Problem
>>>>
>>>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>>>
>>>> Zheng
>>>>
>>>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>>>> Try this:
>>>>>
>>>>> cd ~/.ant/cache/hadoop/core/sources
>>>>> wget
>>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>>
>>>>>
>>>>> Zheng
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Yours,
>>>> Zheng
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>
>>
>>
>
> Not that this solves your problem, but have you considered using a
> release instead of building from trunk?
>
>



-- 
Yours,
Zheng

Re: Hive Installation Problem

Posted by Yi Mao <ym...@gmail.com>.
Do you have "/master/hadoop/hive-0.4.1-bin" in the classpath of hadoop?
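A NoClassDefFoundError for HiveConf means the Hive jars were not visible to the JVM that bin/hive launches. One quick check, as a sketch: the install path below is taken from the error output above, and the lib layout is an assumption about the 0.4.1 binary release.

```shell
#!/bin/sh
# org.apache.hadoop.hive.conf.HiveConf ships in the Hive jars under
# $HIVE_HOME/lib, which bin/hive is supposed to put on the classpath
# before invoking hadoop's RunJar.
HIVE_HOME=/master/hadoop/hive-0.4.1-bin   # path from the error output

# Verify the jars are actually present where bin/hive looks for them:
ls "$HIVE_HOME"/lib/*.jar 2>/dev/null || echo "no jars under $HIVE_HOME/lib"
```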

On Mon, Feb 8, 2010 at 8:42 PM, baburaj.S <ba...@onmobile.com> wrote:

> Hello All,
>
> We are stuck and unable to install Hive because of the following problem.
> Can anyone please help?
>
> Babu
>
>
> -----Original Message-----
> From: Vidyasagar Venkata Nallapati [mailto:vidyasagar.nallapati@onmobile.com]
> Sent: Monday, February 08, 2010 5:25 PM
> To: hive-user@hadoop.apache.org
> Subject: RE: Hive Installation Problem
>
> Hi Edward,
>
> Even when running the 0.4.1 build, we are getting:
>
> phoenix@ph5:/master/hadoop/hive-0.4.1-bin$ bin/hive
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hive/conf/HiveConf
>        at java.lang.Class.forName0(Native Method)
>        at java.lang.Class.forName(Class.java:247)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hive.conf.HiveConf
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>        ... 3 more
>
> Regards
> Vidyasagar N V
>
> -----Original Message-----
> From: Edward Capriolo [mailto:edlinuxguru@gmail.com]
> Sent: Sunday, February 07, 2010 9:13 PM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
> On Sun, Feb 7, 2010 at 8:51 AM, Vidyasagar Venkata Nallapati
> <vi...@onmobile.com> wrote:
> > Hi,
> >
> > I am getting the problem while downloading the .pom file from
> >
> >
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
> > ERROR: Server access Error: Connection timed out url=
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
> >
> > Is this reporsitory changed?
> >
> > Regards
> > Vidyasagar N V
> >
> > -----Original Message-----
> > From: John Sichi [mailto:jsichi@facebook.com]
> > Sent: Saturday, February 06, 2010 7:39 AM
> > To: hive-user@hadoop.apache.org
> > Subject: Re: Hive Installation Problem
> >
> > By the way, the current IVY_HOME detection in build-common.xml is broken
> because it doesn't do:
> >
> > <property environment="env"/>
> >
> > first.
> >
> > I'll log a JIRA issue for it, but it seems there are other problems with
> it even after I fix that since the build is currenlty installing ivy under
> build/ivy rather than under ${ivy.home}; nothing else in build-common.xml
> references ivy.home.
> >
> > JVS
> >
> > On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:
> >
> >> HI guys,
> >>
> >> Can you have a try to make the following directory the same as mine?
> >> Once this is done, remove the "build" directory, and run "ant package".
> >>
> >> Does this solve the problem?
> >>
> >>
> >>
> >> [zshao@dev ~/.ant] ls -lR
> >> .:
> >> total 3896
> >> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
> >> -rw-r--r--  1 zshao users 3965953 Nov  4  2008
> apache-ivy-2.0.0-rc2-bin.zip
> >> -rw-r--r--  1 zshao users       0 Feb  5 13:04
> apache-ivy-2.0.0-rc2.installed
> >> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
> >> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
> >>
> >> ./apache-ivy-2.0.0-rc2:
> >> total 880
> >> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
> >>
> >> ./cache:
> >> total 4
> >> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
> >>
> >> ./cache/hadoop:
> >> total 4
> >> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
> >>
> >> ./cache/hadoop/core:
> >> total 4
> >> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
> >>
> >> ./cache/hadoop/core/sources:
> >> total 127436
> >> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
> >> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
> >> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
> >> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
> >>
> >> ./lib:
> >> total 880
> >> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
> >>
> >>
> >> Zheng
> >>
> >> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
> >> <vi...@onmobile.com> wrote:
> >>> Hi ,
> >>>
> >>>
> >>>
> >>> We are still getting the problem
> >>>
> >>>
> >>>
> >>> [ivy:retrieve] no resolved descriptor found: launching default resolve
> >>>
> >>> Overriding previous definition of property "ivy.version"
> >>>
> >>> [ivy:retrieve] using ivy parser to parse
> >>> file:/master/hadoop/hive/shims/ivy.xml
> >>>
> >>> [ivy:retrieve] :: resolving dependencies ::
> >>> org.apache.hadoop.hive#shims;working@ph1
> >>>
> >>> [ivy:retrieve]  confs: [default]
> >>>
> >>> [ivy:retrieve]  validate = true
> >>>
> >>> [ivy:retrieve]  refresh = false
> >>>
> >>> [ivy:retrieve] resolving dependencies for configuration 'default'
> >>>
> >>> [ivy:retrieve] == resolving dependencies for
> >>> org.apache.hadoop.hive#shims;working@ph1 [default]
> >>>
> >>> [ivy:retrieve] == resolving dependencies
> >>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1
> [default->*]
> >>>
> >>> [ivy:retrieve] default: Checking cache for: dependency:
> hadoop#core;0.20.1
> >>> {*=[*]}
> >>>
> >>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
> >>> hadoop#core;0.20.1
> >>>
> >>> [ivy:retrieve]          tried
> >>>
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
> >>>
> >>>
> >>>
> >>> And the .pom for this is not getting copied, please suggest something
> on
> >>> this.
> >>>
> >>>
> >>>
> >>> Regards
> >>>
> >>> Vidyasagar N V
> >>>
> >>>
> >>>
> >>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
> >>> Sent: Friday, February 05, 2010 4:59 PM
> >>>
> >>> To: hive-user@hadoop.apache.org
> >>> Subject: RE: Hive Installation Problem
> >>>
> >>>
> >>>
> >>> No I don't have the variable defined. Any other things that I have to
> check.
> >>> Is this happening because I am trying for Hadoop 0.20.1
> >>>
> >>>
> >>>
> >>> Babu
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> From: Carl Steinbach [mailto:carl@cloudera.com]
> >>> Sent: Friday, February 05, 2010 3:07 PM
> >>> To: hive-user@hadoop.apache.org
> >>> Subject: Re: Hive Installation Problem
> >>>
> >>>
> >>>
> >>> Hi Babu,
> >>>
> >>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
> >>> environment variable IVY_HOME
> >>> is set it will use $IVY_HOME/cache instead. Is it possible that you
> have
> >>> this environment
> >>> variable set to a value different than ~/.ant?
> >>>
> >>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com>
> wrote:
> >>>
> >>> I have tried the same, but the installation still gives the same error. I
> >>> don't know if it is looking in the cache. Can we change ivysettings.xml so
> >>> that it resolves the file from the file system rather than through a URL?
> >>>
> >>> Babu
> >>>
> >>> -----Original Message-----
> >>> From: Zheng Shao [mailto:zshao9@gmail.com]
> >>> Sent: Friday, February 05, 2010 12:47 PM
> >>> To: hive-user@hadoop.apache.org
> >>> Subject: Re: Hive Installation Problem
> >>>
> >>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
> >>>
> >>> Zheng
> >>>
> >>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
> >>>> Try this:
> >>>>
> >>>> cd ~/.ant/cache/hadoop/core/sources
> >>>> wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> >>>>
> >>>>
> >>>> Zheng
> >>>>
> >>>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
> >>>>> Hello,
> >>>>>
> >>>>> I am new to Hadoop and am now trying to install Hive. We have the
> >>>>> following setup on our side:
> >>>>>
> >>>>> OS - Ubuntu 9.10
> >>>>> Hadoop - 0.20.1
> >>>>> Hive installation tried - 0.4.0
> >>>>>
> >>>>> Hadoop is installed and working fine. When we were installing Hive,
> >>>>> I got an error that it couldn't resolve the dependencies. I changed the
> >>>>> shims build and properties XML to make the dependencies look for Hadoop
> >>>>> 0.20.1, but now when I call the ant script I get the following error:
> >>>>>
> >>>>> ivy-retrieve-hadoop-source:
> >>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
> >>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
> >>>>> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working
> >>>>> [ivy:retrieve]  confs: [default]
> >>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
> >>>>> ---------------------------------------------------------------------
> >>>>> |                  |            modules            ||   artifacts   |
> >>>>> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
> >>>>> ---------------------------------------------------------------------
> >>>>> |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
> >>>>> ---------------------------------------------------------------------
> >>>>> [ivy:retrieve]
> >>>>> [ivy:retrieve] :: problems summary ::
> >>>>> [ivy:retrieve] :::: WARNINGS
> >>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
> >>>>> [ivy:retrieve]  ==== hadoop-source: tried
> >>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> >>>>> [ivy:retrieve]    http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> >>>>> [ivy:retrieve]  ==== apache-snapshot: tried
> >>>>> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
> >>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> >>>>> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
> >>>>> [ivy:retrieve]  ==== maven2: tried
> >>>>> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
> >>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
> >>>>> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
> >>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> >>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
> >>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> >>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
> >>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
> >>>>> [ivy:retrieve] :::: ERRORS
> >>>>> [ivy:retrieve]  Server access Error: Connection timed out
> >>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
> >>>>> [ivy:retrieve]  Server access Error: Connection timed out
> >>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
> >>>>> [ivy:retrieve]  Server access Error: Connection timed out
> >>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
> >>>>> [ivy:retrieve]  Server access Error: Connection timed out
> >>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
> >>>>> [ivy:retrieve]  Server access Error: Connection timed out
> >>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
> >>>>> [ivy:retrieve]
> >>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
> >>>>>
> >>>>> BUILD FAILED
> >>>>> /master/hive/build.xml:148: The following error occurred while executing this line:
> >>>>> /master/hive/build.xml:93: The following error occurred while executing this line:
> >>>>> /master/hive/shims/build.xml:64: The following error occurred while executing this line:
> >>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
> >>>>>       resolve failed - see output for details
> >>>>>
> >>>>> Total time: 15 minutes 55 seconds
> >>>>>
> >>>>>
> >>>>> I have even tried to download hadoop-0.20.1.tar.gz and put it in the
> >>>>> ant cache of the user. Still the same error is repeated. I am stuck and
> >>>>> not able to install it.
> >>>>>
> >>>>> Any help on the above will be greatly appreciated.
> >>>>>
> >>>>> Babu
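[Editor's note: every failure in the log above is a connection timeout, which points at the network rather than at Ivy itself. If the build machine can only reach the internet through an HTTP proxy, Ant (and therefore Ivy) needs the JVM proxy properties. A sketch, where proxy.example.com:8080 is a hypothetical placeholder for your own proxy:]

```shell
# Pass JVM proxy settings to Ant so Ivy can reach the repositories.
# proxy.example.com:8080 is a hypothetical proxy -- substitute your own.
export ANT_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 \
-Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080"
# Then re-run the build from the Hive source root:
#   ant package
```

If there is no proxy and the host simply has no outbound access, pre-seeding the Ivy cache with the tarball (as suggested earlier in the thread) is the alternative.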
> >>>>>
> >>>>>
> >>>>> DISCLAIMER: The information in this message is confidential and may
> be
> >>>>> legally privileged. It is intended solely for the addressee. Access
> to this
> >>>>> message by anyone else is unauthorized. If you are not the intended
> >>>>> recipient, any disclosure, copying, or distribution of the message,
> or any
> >>>>> action or omission taken by you in reliance on it, is prohibited and
> may be
> >>>>> unlawful. Please immediately contact the sender if you have received
> this
> >>>>> message in error. Further, this e-mail may contain viruses and all
> >>>>> reasonable precaution to minimize the risk arising there from is
> taken by
> >>>>> OnMobile. OnMobile is not liable for any damage sustained by you as a
> result
> >>>>> of any virus in this e-mail. All applicable virus checks should be
> carried
> >>>>> out by you before opening this e-mail or any attachment thereto.
> >>>>> Thank you - OnMobile Global Limited.
> >>>>>
> >>>>
> >>>>
> >>>>
> >>>> --
> >>>> Yours,
> >>>> Zheng
> >>>>
> >>>
> >>>
> >>>
> >>> --
> >>> Yours,
> >>> Zheng
> >>>
> >>>
> >>>
> >>>
> >>>
> >>>
> >>>
> >>>
> >>
> >>
> >>
> >> --
> >> Yours,
> >> Zheng
> >
> >
> >
>
> Not that this solves your problem, but have you considered using a
> release instead of building from trunk?
>
>
>

RE: Hive Installation Problem

Posted by "baburaj.S" <ba...@onmobile.com>.
Hello All,

We are stuck and unable to install Hive because of the following problem. Can anyone help, please?

Babu


-----Original Message-----
From: Vidyasagar Venkata Nallapati [mailto:vidyasagar.nallapati@onmobile.com]
Sent: Monday, February 08, 2010 5:25 PM
To: hive-user@hadoop.apache.org
Subject: RE: Hive Installation Problem

Hi Edward,

Even when running through the 0.4.1 build, we are getting:

phoenix@ph5:/master/hadoop/hive-0.4.1-bin$ bin/hive
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
        ... 3 more

Regards
Vidyasagar N V
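[Editor's note: the NoClassDefFoundError above usually means bin/hive could not put the Hive jars on the classpath, for example because HIVE_HOME points at the wrong directory or HADOOP_HOME is unset. A minimal sanity check, assuming the 0.4.1 binary layout from the error report (the path is illustrative):]

```shell
# Sanity-check the Hive/Hadoop environment before running bin/hive.
# The default below is the path from the error report; adjust as needed.
HIVE_HOME=${HIVE_HOME:-/master/hadoop/hive-0.4.1-bin}
export HIVE_HOME

# The hive-* jars (including the one containing HiveConf) must be in lib/.
ls "$HIVE_HOME"/lib/hive-*.jar 2>/dev/null || echo "no hive jars under $HIVE_HOME/lib"

# bin/hive shells out to Hadoop, so HADOOP_HOME must be set too.
echo "HADOOP_HOME=${HADOOP_HOME:-<unset>}"
```

If the lib/ check prints nothing, the binary tarball was probably unpacked incompletely or HIVE_HOME points somewhere else.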

-----Original Message-----
From: Edward Capriolo [mailto:edlinuxguru@gmail.com]
Sent: Sunday, February 07, 2010 9:13 PM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

On Sun, Feb 7, 2010 at 8:51 AM, Vidyasagar Venkata Nallapati
<vi...@onmobile.com> wrote:
> Hi,
>
> I am getting the problem while downloading the .pom file from
>
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
> ERROR: Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
>
> Has this repository changed?
>
> Regards
> Vidyasagar N V
>
> -----Original Message-----
> From: John Sichi [mailto:jsichi@facebook.com]
> Sent: Saturday, February 06, 2010 7:39 AM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
> By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:
>
> <property environment="env"/>
>
> first.
>
> I'll log a JIRA issue for it, but there seem to be other problems even after I fix that, since the build is currently installing Ivy under build/ivy rather than under ${ivy.home}; nothing else in build-common.xml references ivy.home.
>
> JVS
>
> On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:
>
>> Hi guys,
>>
>> Can you try making the following directory layout the same as mine?
>> Once this is done, remove the "build" directory, and run "ant package".
>>
>> Does this solve the problem?
>>
>>
>>
>> [zshao@dev ~/.ant] ls -lR
>> .:
>> total 3896
>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
>> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
>> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
>> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>>
>> ./apache-ivy-2.0.0-rc2:
>> total 880
>> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>>
>> ./cache:
>> total 4
>> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>>
>> ./cache/hadoop:
>> total 4
>> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>>
>> ./cache/hadoop/core:
>> total 4
>> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>>
>> ./cache/hadoop/core/sources:
>> total 127436
>> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
>> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
>> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
>> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>>
>> ./lib:
>> total 880
>> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
>>
>>
>> Zheng
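[Editor's note: the layout Zheng lists above can be reproduced with a short script. This is a sketch of the workaround already suggested in the thread: pre-seed the default Ivy cache (~/.ant/cache) with the Hadoop source tarball so the build never has to download it. If IVY_HOME is set, substitute $IVY_HOME/cache for ~/.ant/cache.]

```shell
# Pre-seed the Ivy cache with the Hadoop source tarball that the
# Hive shims build tries to resolve via Ivy.
HADOOP_VERSION=0.20.1
CACHE_DIR="$HOME/.ant/cache/hadoop/core/sources"
TARBALL="hadoop-${HADOOP_VERSION}.tar.gz"
URL="http://archive.apache.org/dist/hadoop/core/hadoop-${HADOOP_VERSION}/${TARBALL}"

mkdir -p "$CACHE_DIR"
# Download only if the tarball is not already cached; clean up on failure
# so a zero-byte file does not mask the problem on the next run.
if [ ! -f "$CACHE_DIR/$TARBALL" ]; then
    wget -q -O "$CACHE_DIR/$TARBALL" "$URL" \
        || { rm -f "$CACHE_DIR/$TARBALL"; echo "download failed; fetch $URL by hand into $CACHE_DIR"; }
fi
```

After seeding the cache, remove the build directory and re-run "ant package" as described above.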
>>
>> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
>> <vi...@onmobile.com> wrote:
>>> Hi ,
>>>
>>>
>>>
>>> We are still getting the problem
>>>
>>>
>>>
>>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>>
>>> Overriding previous definition of property "ivy.version"
>>>
>>> [ivy:retrieve] using ivy parser to parse
>>> file:/master/hadoop/hive/shims/ivy.xml
>>>
>>> [ivy:retrieve] :: resolving dependencies ::
>>> org.apache.hadoop.hive#shims;working@ph1
>>>
>>> [ivy:retrieve]  confs: [default]
>>>
>>> [ivy:retrieve]  validate = true
>>>
>>> [ivy:retrieve]  refresh = false
>>>
>>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>>
>>> [ivy:retrieve] == resolving dependencies for
>>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>>
>>> [ivy:retrieve] == resolving dependencies
>>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>>
>>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>>> {*=[*]}
>>>
>>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>>> hadoop#core;0.20.1
>>>
>>> [ivy:retrieve]          tried
>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>
>>>
>>>
>>> And the .pom for this is not getting copied, please suggest something on
>>> this.
>>>
>>>
>>>
>>> Regards
>>>
>>> Vidyasagar N V
>>>
>>>
>>>
>>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>>> Sent: Friday, February 05, 2010 4:59 PM
>>>
>>> To: hive-user@hadoop.apache.org
>>> Subject: RE: Hive Installation Problem
>>>
>>>
>>>
>>> No, I don't have the variable defined. Is there anything else I should check?
>>> Is this happening because I am trying Hadoop 0.20.1?
>>>
>>>
>>>
>>> Babu
>>>
>>>
>>>
>>>
>>>
>>> From: Carl Steinbach [mailto:carl@cloudera.com]
>>> Sent: Friday, February 05, 2010 3:07 PM
>>> To: hive-user@hadoop.apache.org
>>> Subject: Re: Hive Installation Problem
>>>
>>>
>>>
>>> Hi Babu,
>>>
>>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>>> environment variable IVY_HOME
>>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>>> this environment
>>> variable set to a value different than ~/.ant?
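[Editor's note: the rule Carl describes can be checked mechanically. A small sketch that mirrors that logic (use $IVY_HOME/cache when IVY_HOME is set, otherwise ~/.ant/cache) and lists any Hadoop tarballs already cached:]

```shell
# Resolve the Ivy cache directory the Hive build will actually use.
if [ -n "${IVY_HOME:-}" ]; then
    IVY_CACHE="$IVY_HOME/cache"
else
    IVY_CACHE="$HOME/.ant/cache"
fi
echo "Ivy cache: $IVY_CACHE"

# Any pre-seeded Hadoop source tarballs would live here.
ls -l "$IVY_CACHE/hadoop/core/sources" 2>/dev/null || echo "no hadoop tarballs cached yet"
```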
>>>
>>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>>
>>> I have tried the same, but the installation still gives the same error. I
>>> don't know if it is looking in the cache. Can we change ivysettings.xml so
>>> that it resolves the file from the file system rather than through a URL?
>>>
>>> Babu
>>>
>>> -----Original Message-----
>>> From: Zheng Shao [mailto:zshao9@gmail.com]
>>> Sent: Friday, February 05, 2010 12:47 PM
>>> To: hive-user@hadoop.apache.org
>>> Subject: Re: Hive Installation Problem
>>>
>>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>>
>>> Zheng
>>>
>>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>>> Try this:
>>>>
>>>> cd ~/.ant/cache/hadoop/core/sources
>>>> wget
>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>
>>>>
>>>> Zheng
>>>>
>>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>>>> Hello,
>>>>>
>>>>> I am new to Hadoop and am now trying to install Hive. We have the
>>>>> following setup on our side:
>>>>>
>>>>> OS - Ubuntu 9.10
>>>>> Hadoop - 0.20.1
>>>>> Hive installation tried - 0.4.0
>>>>>
>>>>> Hadoop is installed and working fine. When we were installing Hive,
>>>>> I got an error that it couldn't resolve the dependencies. I changed the
>>>>> shims build and properties XML to make the dependencies look for Hadoop
>>>>> 0.20.1, but now when I call the ant script I get the following error:
>>>>>
>>>>> ivy-retrieve-hadoop-source:
>>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>>>> http://ant.apache.org/ivy/ :
>>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>>>> [ivy:retrieve] :: resolving dependencies ::
>>>>> org.apache.hadoop.hive#shims;working
>>>>> [ivy:retrieve]  confs: [default]
>>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl
>>>>> 0ms
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>>       |                  |            modules            ||   artifacts
>>>>>  |
>>>>>       |       conf       | number| search|dwnlded|evicted||
>>>>> number|dwnlded|
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>>       |      default     |   1   |   0   |   0   |   0   ||   0   |   0
>>>>>  |
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> [ivy:retrieve]
>>>>> [ivy:retrieve] :: problems summary ::
>>>>> [ivy:retrieve] :::: WARNINGS
>>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>>>> [ivy:retrieve]  ==== hadoop-source: tried
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>>>> [ivy:retrieve]
>>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  ==== maven2: tried
>>>>> [ivy:retrieve]
>>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve] :::: ERRORS
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>>> [ivy:retrieve]
>>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>>>
>>>>> BUILD FAILED
>>>>> /master/hive/build.xml:148: The following error occurred while executing
>>>>> this line:
>>>>> /master/hive/build.xml:93: The following error occurred while executing
>>>>> this line:
>>>>> /master/hive/shims/build.xml:64: The following error occurred while
>>>>> executing this line:
>>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>>>       resolve failed - see output for details
>>>>>
>>>>> Total time: 15 minutes 55 seconds
>>>>>
>>>>>
>>>>> I have even tried to download hadoop-0.20.1.tar.gz and put it in the
>>>>> user's ant cache. Still the same error is repeated. I am stuck and not
>>>>> able to install it.
>>>>>
>>>>> Any help on the above will be greatly appreciated.
>>>>>
>>>>> Babu
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Yours,
>>>> Zheng
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>
>
>

Not that this solves your problem, but have you considered using a
release instead of building from trunk?



RE: Hive Installation Problem

Posted by Vidyasagar Venkata Nallapati <vi...@onmobile.com>.
Hi Edward,

Even when running through the 0.4.1 build, we are getting:

phoenix@ph5:/master/hadoop/hive-0.4.1-bin$ bin/hive
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
        ... 3 more
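For what it's worth, a NoClassDefFoundError for HiveConf usually means the Hive jars never made it onto the classpath that bin/hive assembles. A minimal sketch of the kind of check to run; the temp directory and placeholder jar below are stand-ins, so point HIVE_HOME at the real install:

```shell
# Sketch: verify that Hive jars exist where bin/hive expects them.
# mktemp/touch create a stand-in install purely for illustration.
HIVE_HOME=$(mktemp -d)
mkdir -p "$HIVE_HOME/lib"
touch "$HIVE_HOME/lib/hive_exec.jar"   # placeholder jar, not a real artifact

found=no
for jar in "$HIVE_HOME"/lib/hive_*.jar; do
  [ -f "$jar" ] && found=yes           # at least one Hive jar is present
done
echo "hive jars present: $found"
```

If no hive_*.jar files are present under lib, the build that produced the tarball likely failed partway, which would match the earlier Ivy resolution errors.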

Regards
Vidyasagar N V

-----Original Message-----
From: Edward Capriolo [mailto:edlinuxguru@gmail.com]
Sent: Sunday, February 07, 2010 9:13 PM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

On Sun, Feb 7, 2010 at 8:51 AM, Vidyasagar Venkata Nallapati
<vi...@onmobile.com> wrote:
> Hi,
>
> I am getting the problem while downloading the .pom file from
>
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
> ERROR: Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
>
> Has this repository changed?
>
> Regards
> Vidyasagar N V
>
> -----Original Message-----
> From: John Sichi [mailto:jsichi@facebook.com]
> Sent: Saturday, February 06, 2010 7:39 AM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
> By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:
>
> <property environment="env"/>
>
> first.
>
> I'll log a JIRA issue for it, but it seems there are other problems with it even after I fix that, since the build is currently installing Ivy under build/ivy rather than under ${ivy.home}; nothing else in build-common.xml references ivy.home.
>
> JVS
>
> On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:
>
>> Hi guys,
>>
>> Can you try making the following directory layout the same as mine?
>> Once that is done, remove the "build" directory and run "ant package".
>>
>> Does this solve the problem?
>>
>>
>>
>> [zshao@dev ~/.ant] ls -lR
>> .:
>> total 3896
>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
>> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
>> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
>> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
>> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>>
>> ./apache-ivy-2.0.0-rc2:
>> total 880
>> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>>
>> ./cache:
>> total 4
>> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>>
>> ./cache/hadoop:
>> total 4
>> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>>
>> ./cache/hadoop/core:
>> total 4
>> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>>
>> ./cache/hadoop/core/sources:
>> total 127436
>> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
>> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
>> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
>> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>>
>> ./lib:
>> total 880
>> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
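The layout above can be recreated with a few commands. A sketch, assuming Hadoop 0.20.1 (adjust the version to whatever shims/ivy.xml asks for); HOME is pointed at a temp directory so the sketch is side-effect free, and the wget is left commented out:

```shell
# Recreate the expected Ivy cache layout under ~/.ant.
# Drop the HOME override to operate on your real home directory.
HOME=$(mktemp -d)
ANT_CACHE="$HOME/.ant/cache"   # Hive uses $IVY_HOME/cache instead if IVY_HOME is set
mkdir -p "$ANT_CACHE/hadoop/core/sources"
cd "$ANT_CACHE/hadoop/core/sources"
# wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
pwd
```

After the tarball lands in that directory, remove the build directory and rerun "ant package" as described above.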
>>
>>
>> Zheng
>>
>> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
>> <vi...@onmobile.com> wrote:
>>> Hi ,
>>>
>>>
>>>
>>> We are still getting the problem
>>>
>>>
>>>
>>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>>
>>> Overriding previous definition of property "ivy.version"
>>>
>>> [ivy:retrieve] using ivy parser to parse
>>> file:/master/hadoop/hive/shims/ivy.xml
>>>
>>> [ivy:retrieve] :: resolving dependencies ::
>>> org.apache.hadoop.hive#shims;working@ph1
>>>
>>> [ivy:retrieve]  confs: [default]
>>>
>>> [ivy:retrieve]  validate = true
>>>
>>> [ivy:retrieve]  refresh = false
>>>
>>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>>
>>> [ivy:retrieve] == resolving dependencies for
>>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>>
>>> [ivy:retrieve] == resolving dependencies
>>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>>
>>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>>> {*=[*]}
>>>
>>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>>> hadoop#core;0.20.1
>>>
>>> [ivy:retrieve]          tried
>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>
>>>
>>>
>>> And the .pom for this is not getting copied; please suggest something on
>>> this.
>>>
>>>
>>>
>>> Regards
>>>
>>> Vidyasagar N V
>>>
>>>
>>>
>>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>>> Sent: Friday, February 05, 2010 4:59 PM
>>>
>>> To: hive-user@hadoop.apache.org
>>> Subject: RE: Hive Installation Problem
>>>
>>>
>>>
>>> No, I don't have the variable defined. Is there anything else I should check?
>>> Is this happening because I am using Hadoop 0.20.1?
>>>
>>>
>>>
>>> Babu
>>>
>>>
>>>
>>>
>>>
>>> From: Carl Steinbach [mailto:carl@cloudera.com]
>>> Sent: Friday, February 05, 2010 3:07 PM
>>> To: hive-user@hadoop.apache.org
>>> Subject: Re: Hive Installation Problem
>>>
>>>
>>>
>>> Hi Babu,
>>>
>>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>>> environment variable IVY_HOME
>>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>>> this environment
>>> variable set to a value different than ~/.ant?
>>>
>>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>>
>>> I have tried that, but the installation still gives the same error. I
>>> don't know whether it is looking in the cache. Can we change
>>> ivysettings.xml so that it resolves the file from the file system rather
>>> than through a URL?
>>>
>>> Babu
>>>
>>> -----Original Message-----
>>> From: Zheng Shao [mailto:zshao9@gmail.com]
>>> Sent: Friday, February 05, 2010 12:47 PM
>>> To: hive-user@hadoop.apache.org
>>> Subject: Re: Hive Installation Problem
>>>
>>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>>
>>> Zheng
>>>
>>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>>> Try this:
>>>>
>>>> cd ~/.ant/cache/hadoop/core/sources
>>>> wget
>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>
>>>>
>>>> Zheng
>>>>
>>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>>>> Hello ,
>>>>>
>>>>> I am new to Hadoop and am now trying to install Hive. We have the
>>>>> following setup on our side:
>>>>>
>>>>> OS - Ubuntu 9.10
>>>>> Hadoop - 0.20.1
>>>>> Hive installation tried - 0.4.0 .
>>>>>
>>>>> Hadoop is installed and working fine. When we were installing Hive I
>>>>> got an error that it couldn't resolve the dependencies. I changed the
>>>>> shims build and properties XML to make the dependencies look for Hadoop
>>>>> 0.20.1, but now when I call the ant script I get the following error:
>>>>>
>>>>> ivy-retrieve-hadoop-source:
>>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>>>> http://ant.apache.org/ivy/ :
>>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>>>> [ivy:retrieve] :: resolving dependencies ::
>>>>> org.apache.hadoop.hive#shims;working
>>>>> [ivy:retrieve]  confs: [default]
>>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
>>>>> ---------------------------------------------------------------------
>>>>> |                  |            modules            ||   artifacts   |
>>>>> |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>>>>> ---------------------------------------------------------------------
>>>>> |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>>>>> ---------------------------------------------------------------------
>>>>> [ivy:retrieve]
>>>>> [ivy:retrieve] :: problems summary ::
>>>>> [ivy:retrieve] :::: WARNINGS
>>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>>>> [ivy:retrieve]  ==== hadoop-source: tried
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>>>> [ivy:retrieve]
>>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  ==== maven2: tried
>>>>> [ivy:retrieve]
>>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>>> [ivy:retrieve]
>>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>>> [ivy:retrieve] :::: ERRORS
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>>> [ivy:retrieve]
>>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>>>
>>>>> BUILD FAILED
>>>>> /master/hive/build.xml:148: The following error occurred while executing
>>>>> this line:
>>>>> /master/hive/build.xml:93: The following error occurred while executing
>>>>> this line:
>>>>> /master/hive/shims/build.xml:64: The following error occurred while
>>>>> executing this line:
>>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>>>       resolve failed - see output for details
>>>>>
>>>>> Total time: 15 minutes 55 seconds
>>>>>
>>>>>
>>>>> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the
>>>>> user's ant cache, but the same error is repeated. I am stuck and not
>>>>> able to install it.
>>>>>
>>>>> Any help on the above will be greatly appreciated.
>>>>>
>>>>> Babu
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Yours,
>>>> Zheng
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>
>
>

Not that this solves your problem, but have you considered using a
release instead of building from trunk?
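A sketch of what that would look like; the archive URL and version are assumptions, so check the Hive downloads page for the release that matches your Hadoop:

```shell
# Illustrative: fetch a prebuilt Hive release instead of building trunk.
# The URL below is an assumption; verify it before use.
HIVE_VERSION=0.4.1
URL="http://archive.apache.org/dist/hadoop/hive/hive-${HIVE_VERSION}/hive-${HIVE_VERSION}-bin.tar.gz"
echo "would fetch: $URL"
# wget "$URL" && tar -xzf "hive-${HIVE_VERSION}-bin.tar.gz"
```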


Re: Hive Installation Problem

Posted by Edward Capriolo <ed...@gmail.com>.

Not that this solves your problem, but have you considered using a
release instead of building from trunk?

RE: Hive Installation Problem

Posted by "baburaj.S" <ba...@onmobile.com>.
Getting the following problem now:

Apache Ant version 1.7.1 compiled on October 19 2009
Buildfile: build.xml
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
parsing buildfile /master/hadoop/hive/build.xml with URI = file:/master/hadoop/hive/build.xml
Project base dir set to: /master/hadoop/hive
[antlib:org.apache.tools.ant] Could not load definitions from resource org/apache/tools/ant/antlib.xml. It could not be found.
 [property] Loading /master/hadoop/hive/build.properties
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
 [macrodef] creating macro  macro_tar
 [macrodef] creating macro  iterate-cpp
 [macrodef] creating macro  iterate-all
 [macrodef] creating macro  iterate
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore test-init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
Already defined in main or a previous import, ignore test
Already defined in main or a previous import, ignore clean-test
Already defined in main or a previous import, ignore clean
Override ignored for property "hive.root"
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Override ignored for property "test.build.dir"
Override ignored for property "test.data.dir"
Property "env.IVY_HOME" has not been set
[available] Unable to find build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to common-classpath
Overriding previous definition of reference to classpath
Build sequence for target(s) `package' is [jar, package]
Complete build sequence is [jar, package, ivy-init-dirs, javadoc, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve-checkstyle, ivy-retrieve-checkstyle, check-for-checkstyle, checkstyle, ivy-resolve, ivy-retrieve, create-dirs, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, init, compile-cpp, hivecommon.ivy-init, clean-eclipse-files, clean-test, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, docs, binary, testreport, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, compile-cpp-clean, gen-testdata, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, package-cpp, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, tar, hivecommon.jar, clean, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, hivecommon.ivy-retrieve-checkstyle, eclipse-files, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, changes-to-html, ]

jar:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [jar] in build file /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/shims/build.xml with URI = file:/master/hadoop/hive/shims/build.xml
Project base dir set to: /master/hadoop/hive/shims
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore test
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/shims/build.properties
 [property] Unable to find property file: /master/hadoop/hive/shims/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to classpath
   [subant] Entering /master/hadoop/hive/shims/build.xml...
Build sequence for target(s) `jar' is [create-dirs, compile-ant-tasks, deploy-ant-tasks, init, compile, jar]
Complete build sequence is [create-dirs, compile-ant-tasks, deploy-ant-tasks, init, compile, jar, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, build_shims, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test/classes because it already exists.

compile-ant-tasks:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [compile] in build file /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/ant/build.xml with URI = file:/master/hadoop/hive/ant/build.xml
Project base dir set to: /master/hadoop/hive/ant
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/ant/build.properties
 [property] Unable to find property file: /master/hadoop/hive/ant/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
   [subant] Entering /master/hadoop/hive/ant/build.xml...
Build sequence for target(s) `compile' is [create-dirs, init, compile]
Complete build sequence is [create-dirs, init, compile, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test-conditions, gen-test, compile-test, test-jar, test, hivecommon.ivy-retrieve, setup, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, hivecommon.setup, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/classes because it already exists.

init:

compile:
     [echo] Compiling: anttasks
    [javac] org/apache/hadoop/hive/ant/GetVersionPref.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
    [javac] org/apache/hadoop/hive/ant/QTestGenTask.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.
   [subant] Exiting /master/hadoop/hive/ant/build.xml.

deploy-ant-tasks:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [jar] in build file /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/ant/build.xml with URI = file:/master/hadoop/hive/ant/build.xml
Project base dir set to: /master/hadoop/hive/ant
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/ant/build.properties
 [property] Unable to find property file: /master/hadoop/hive/ant/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
   [subant] Entering /master/hadoop/hive/ant/build.xml...
Build sequence for target(s) `jar' is [create-dirs, init, compile, jar]
Complete build sequence is [create-dirs, init, compile, jar, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test-conditions, gen-test, compile-test, test-jar, test, hivecommon.ivy-retrieve, setup, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, hivecommon.setup, ivy-retrieve-checkstyle, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/classes because it already exists.

init:

compile:
     [echo] Compiling: anttasks
    [javac] org/apache/hadoop/hive/ant/GetVersionPref.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
    [javac] org/apache/hadoop/hive/ant/QTestGenTask.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.

jar:
     [copy] /master/hadoop/hive/ant/src/org/apache/hadoop/hive/ant/antlib.xml omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/antlib.xml is up to date.
      [jar] org omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/ is up to date.
      [jar] org/apache omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/ is up to date.
      [jar] org/apache/hadoop omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/ is up to date.
      [jar] org/apache/hadoop/hive omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ is up to date.
      [jar] org/apache/hadoop/hive/ant omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/ is up to date.
      [jar] org/apache/hadoop/hive/ant/GetVersionPref.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask$QFileFilter.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask$QFileFilter.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask$QFileRegexFilter.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask$QFileRegexFilter.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.
      [jar] org/apache/hadoop/hive/ant/antlib.xml omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/antlib.xml is up to date.
      [jar] No Implementation-Title set.No Implementation-Version set.No Implementation-Vendor set.
      [jar] Location: /master/hadoop/hive/ant/build.xml:49:
   [subant] Exiting /master/hadoop/hive/ant/build.xml.

init:

compile:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
  [antcall] calling target(s) [build_shims] in build file /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/shims/build.xml with URI = file:/master/hadoop/hive/shims/build.xml
Project base dir set to: /master/hadoop/hive/shims
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore test
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/shims/build.properties
 [property] Unable to find property file: /master/hadoop/hive/shims/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to classpath
Build sequence for target(s) `build_shims' is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims]
Complete build sequence is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims, hivecommon.test-conditions, test-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, create-dirs, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-ant-tasks, deploy-ant-tasks, init, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]
  [antcall] Entering /master/hadoop/hive/shims/build.xml...
Build sequence for target(s) `build_shims' is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims]
Complete build sequence is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims, hivecommon.test-conditions, test-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, create-dirs, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-ant-tasks, deploy-ant-tasks, init, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

ivy-init-dirs:
    [mkdir] Skipping /master/hadoop/hive/build/ivy because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/lib because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/report because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/maven because it already exists.

ivy-probe-antlib:
parsing buildfile jar:file:/usr/share/ant/lib/ant.jar!/org/apache/tools/ant/types/conditions/antlib.xml with URI = jar:file:/usr/share/ant/lib/ant.jar!/org/apache/tools/ant/types/conditions/antlib.xml
[antlib:org.apache.ivy.ant] Could not load definitions from resource org/apache/ivy/ant/antlib.xml. It could not be found.

ivy-init-antlib:
dropping /master/hadoop/hive/build/ivy/lib/ivy-2.0.0-rc2.jar from path as it doesn't exist
  [typedef] Could not load definitions from resource org/apache/ivy/ant/antlib.xml. It could not be found.
  [antcall] Exiting /master/hadoop/hive/shims/build.xml.
   [subant] Exiting /master/hadoop/hive/shims/build.xml.
Apache Ant version 1.7.1 compiled on October 19 2009
Buildfile: build.xml
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
parsing buildfile /master/hadoop/hive/build.xml with URI = file:/master/hadoop/hive/build.xml
Project base dir set to: /master/hadoop/hive
[antlib:org.apache.tools.ant] Could not load definitions from resource org/apache/tools/ant/antlib.xml. It could not be found.
 [property] Loading /master/hadoop/hive/build.properties
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
 [macrodef] creating macro  macro_tar
 [macrodef] creating macro  iterate-cpp
 [macrodef] creating macro  iterate-all
 [macrodef] creating macro  iterate
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore test-init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
Already defined in main or a previous import, ignore test
Already defined in main or a previous import, ignore clean-test
Already defined in main or a previous import, ignore clean
Override ignored for property "hive.root"
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "jetty.test.jar"
Override ignored for property "javac.version"
Override ignored for property "hadoop.root.default"
Override ignored for property "version"
Override ignored for property "Name"
Override ignored for property "hadoop.jar"
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "javac.debug"
Override ignored for property "common.jar"
Override ignored for property "hadoop.version"
Override ignored for property "build.dir.hadoop"
Override ignored for property "build.dir.hive"
Override ignored for property "hadoop.root"
Override ignored for property "jasper.test.jar"
Override ignored for property "javac.optimize"
Override ignored for property "hadoop.test.jar"
Override ignored for property "year"
Override ignored for property "servlet.test.jar"
Override ignored for property "jasperc.test.jar"
Override ignored for property "hadoop.mirror"
Override ignored for property "javac.args"
Override ignored for property "jsp.test.jar"
Override ignored for property "javac.args.warnings"
Override ignored for property "javac.deprecation"
Override ignored for property "name"
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Override ignored for property "test.build.dir"
Override ignored for property "test.data.dir"
Property "env.IVY_HOME" has not been set
[available] Unable to find build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to common-classpath
Overriding previous definition of reference to classpath
Build sequence for target(s) `package' is [jar, package]
Complete build sequence is [jar, package, ivy-init-dirs, javadoc, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve-checkstyle, ivy-retrieve-checkstyle, check-for-checkstyle, checkstyle, ivy-resolve, ivy-retrieve, create-dirs, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, init, compile-cpp, hivecommon.ivy-init, clean-eclipse-files, clean-test, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, docs, binary, testreport, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, compile-cpp-clean, gen-testdata, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, package-cpp, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, tar, hivecommon.jar, clean, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, hivecommon.ivy-retrieve-checkstyle, eclipse-files, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, changes-to-html, ]

jar:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [jar] in build file /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/shims/build.xml with URI = file:/master/hadoop/hive/shims/build.xml
Project base dir set to: /master/hadoop/hive/shims
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore test
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/shims/build.properties
 [property] Unable to find property file: /master/hadoop/hive/shims/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to classpath
   [subant] Entering /master/hadoop/hive/shims/build.xml...
Build sequence for target(s) `jar' is [create-dirs, compile-ant-tasks, deploy-ant-tasks, init, compile, jar]
Complete build sequence is [create-dirs, compile-ant-tasks, deploy-ant-tasks, init, compile, jar, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, build_shims, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/shims/test/classes because it already exists.

compile-ant-tasks:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [compile] in build file /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/ant/build.xml with URI = file:/master/hadoop/hive/ant/build.xml
Project base dir set to: /master/hadoop/hive/ant
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/ant/build.properties
 [property] Unable to find property file: /master/hadoop/hive/ant/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
   [subant] Entering /master/hadoop/hive/ant/build.xml...
Build sequence for target(s) `compile' is [create-dirs, init, compile]
Complete build sequence is [create-dirs, init, compile, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test-conditions, gen-test, compile-test, test-jar, test, hivecommon.ivy-retrieve, setup, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, hivecommon.setup, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/classes because it already exists.

init:

compile:
     [echo] Compiling: anttasks
    [javac] org/apache/hadoop/hive/ant/GetVersionPref.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
    [javac] org/apache/hadoop/hive/ant/QTestGenTask.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.
   [subant] Exiting /master/hadoop/hive/ant/build.xml.

deploy-ant-tasks:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
   [subant] calling target(s) [jar] in build file /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/ant/build.xml with URI = file:/master/hadoop/hive/ant/build.xml
Project base dir set to: /master/hadoop/hive/ant
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/ant/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore init
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore jar
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/ant/build.properties
 [property] Unable to find property file: /master/hadoop/hive/ant/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
   [subant] Entering /master/hadoop/hive/ant/build.xml...
Build sequence for target(s) `jar' is [create-dirs, init, compile, jar]
Complete build sequence is [create-dirs, init, compile, jar, ivy-init-dirs, hivecommon.test-conditions, test-init, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test-conditions, gen-test, compile-test, test-jar, test, hivecommon.ivy-retrieve, setup, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, compile-ant-tasks, deploy-ant-tasks, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, ivy-retrieve-hadoop-source, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, install-hadoopcore-internal, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, hivecommon.setup, ivy-retrieve-checkstyle, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

create-dirs:
    [mkdir] Skipping /master/hadoop/hive/build because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/jexl/classes because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/hadoopcore because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/src because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/anttasks/test/classes because it already exists.

init:

compile:
     [echo] Compiling: anttasks
    [javac] org/apache/hadoop/hive/ant/GetVersionPref.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
    [javac] org/apache/hadoop/hive/ant/QTestGenTask.java omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.

jar:
     [copy] /master/hadoop/hive/ant/src/org/apache/hadoop/hive/ant/antlib.xml omitted as /master/hadoop/hive/build/anttasks/classes/org/apache/hadoop/hive/ant/antlib.xml is up to date.
      [jar] org omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/ is up to date.
      [jar] org/apache omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/ is up to date.
      [jar] org/apache/hadoop omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/ is up to date.
      [jar] org/apache/hadoop/hive omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ is up to date.
      [jar] org/apache/hadoop/hive/ant omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/ is up to date.
      [jar] org/apache/hadoop/hive/ant/GetVersionPref.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/GetVersionPref.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask$QFileFilter.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask$QFileFilter.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask$QFileRegexFilter.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask$QFileRegexFilter.class is up to date.
      [jar] org/apache/hadoop/hive/ant/QTestGenTask.class omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/QTestGenTask.class is up to date.
      [jar] org/apache/hadoop/hive/ant/antlib.xml omitted as /master/hadoop/hive/build/anttasks/hive-anttasks-0.6.0.jar:org/apache/hadoop/hive/ant/antlib.xml is up to date.
      [jar] No Implementation-Title set.No Implementation-Version set.No Implementation-Vendor set.
      [jar] Location: /master/hadoop/hive/ant/build.xml:49:
   [subant] Exiting /master/hadoop/hive/ant/build.xml.

init:

compile:
Detected Java version: 1.6 in: /usr/lib/jvm/java-6-sun-1.6.0.15/jre
Detected OS: Linux
  [antcall] calling target(s) [build_shims] in build file /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/shims/build.xml with URI = file:/master/hadoop/hive/shims/build.xml
Project base dir set to: /master/hadoop/hive/shims
Importing file /master/hadoop/hive/build-common.xml from /master/hadoop/hive/shims/build.xml
parsing buildfile /master/hadoop/hive/build-common.xml with URI = file:/master/hadoop/hive/build-common.xml
Already defined in main or a previous import, ignore compile
Already defined in main or a previous import, ignore test
 [property] Loading /master/hadoop/hive/build.properties
Override ignored for property "hadoop.version.ant-internal"
Override ignored for property "build.dir.hive"
 [property] Loading /home/phoenix/build.properties
 [property] Unable to find property file: /home/phoenix/build.properties
 [property] Loading /master/hadoop/hive/shims/build.properties
 [property] Unable to find property file: /master/hadoop/hive/shims/build.properties
Override ignored for property "build.dir.hive"
Override ignored for property "build.dir.hadoop"
Property "env.IVY_HOME" has not been set
[available] Unable to find /master/hadoop/hive/build/hadoopcore/hadoop-0.20.1.installed to set property hadoopcore.0.20.1.install.done
Overriding previous definition of reference to classpath
Build sequence for target(s) `build_shims' is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims]
Complete build sequence is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims, hivecommon.test-conditions, test-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, create-dirs, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-ant-tasks, deploy-ant-tasks, init, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]
  [antcall] Entering /master/hadoop/hive/shims/build.xml...
Build sequence for target(s) `build_shims' is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims]
Complete build sequence is [ivy-init-dirs, ivy-probe-antlib, ivy-init-antlib, ivy-init, ivy-retrieve-hadoop-source, install-hadoopcore-internal, build_shims, hivecommon.test-conditions, test-init, ivy-resolve, ivy-retrieve, hivecommon.install-hadoopcore-default, create-dirs, hivecommon.ivy-init-antlib, hivecommon.ivy-init, test, hivecommon.ivy-retrieve, setup, test-conditions, gen-test, compile-ant-tasks, deploy-ant-tasks, init, compile, compile-test, test-jar, hivecommon.test, hivecommon.ivy-retrieve-hadoop-source, hivecommon.clean-test, hivecommon.clean, hivecommon.ivy-init-dirs, clean-test, hivecommon.deploy-ant-tasks, hivecommon.install-hadoopcore-internal, install-hadoopcore-default, install-hadoopcore, hivecommon.compile, hivecommon.ivy-resolve-checkstyle, hivecommon.gen-test, hivecommon.ivy-probe-antlib, hivecommon.ivy-resolve, hivecommon.test-init, clean, hivecommon.jar, hivecommon.compile-ant-tasks, hivecommon.compile-test, hivecommon.setup, ivy-resolve-checkstyle, hivecommon.ivy-retrieve-checkstyle, ivy-retrieve-checkstyle, jar, hivecommon.install-hadoopcore, hivecommon.test-jar, hivecommon.init, hivecommon.create-dirs, ]

ivy-init-dirs:
    [mkdir] Skipping /master/hadoop/hive/build/ivy because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/lib because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/report because it already exists.
    [mkdir] Skipping /master/hadoop/hive/build/ivy/maven because it already exists.

ivy-probe-antlib:
parsing buildfile jar:file:/usr/share/ant/lib/ant.jar!/org/apache/tools/ant/types/conditions/antlib.xml with URI = jar:file:/usr/share/ant/lib/ant.jar!/org/apache/tools/ant/types/conditions/antlib.xml
[antlib:org.apache.ivy.ant] Could not load definitions from resource org/apache/ivy/ant/antlib.xml. It could not be found.

ivy-init-antlib:
dropping /master/hadoop/hive/build/ivy/lib/ivy-2.0.0-rc2.jar from path as it doesn't exist
  [typedef] Could not load definitions from resource org/apache/ivy/ant/antlib.xml. It could not be found.
  [antcall] Exiting /master/hadoop/hive/shims/build.xml.
   [subant] Exiting /master/hadoop/hive/shims/build.xml.

BUILD FAILED
/master/hadoop/hive/build.xml:148: The following error occurred while executing this line:
/master/hadoop/hive/build.xml:93: The following error occurred while executing this line:
/master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:397)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
        at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.Main.runBuild(Main.java:758)
        at org.apache.tools.ant.Main.startAnt(Main.java:217)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)
Caused by: /master/hadoop/hive/build.xml:93: The following error occurred while executing this line:
/master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:418)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:289)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:208)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:62)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:394)
        ... 16 more
Caused by: /master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:418)
        at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        ... 32 more
Caused by: /master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.taskdefs.Exit.execute(Exit.java:142)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        ... 45 more
--- Nested Exception ---
/master/hadoop/hive/build.xml:93: The following error occurred while executing this line:
/master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:418)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:289)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:208)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:62)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:394)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
        at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.Main.runBuild(Main.java:758)
        at org.apache.tools.ant.Main.startAnt(Main.java:217)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)
Caused by: /master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:418)
        at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        ... 32 more
Caused by: /master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.taskdefs.Exit.execute(Exit.java:142)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        ... 45 more
--- Nested Exception ---
/master/hadoop/hive/shims/build.xml:64: The following error occurred while executing this line:
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.ProjectHelper.addLocationToBuildException(ProjectHelper.java:508)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:418)
        at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:289)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:208)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:62)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:394)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
        at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.Main.runBuild(Main.java:758)
        at org.apache.tools.ant.Main.startAnt(Main.java:217)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)
Caused by: /master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.taskdefs.Exit.execute(Exit.java:142)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        ... 45 more
--- Nested Exception ---
/master/hadoop/hive/build-common.xml:122: You need Apache Ivy 2.0 or later from http://ant.apache.org/
      It could not be loaded from http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.0.0-rc2/ivy-2.0.0-rc2.jar
        at org.apache.tools.ant.taskdefs.Exit.execute(Exit.java:142)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:416)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:289)
        at org.apache.tools.ant.taskdefs.SubAnt.execute(SubAnt.java:208)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:62)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:394)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
        at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:357)
        at org.apache.tools.ant.Target.performTasks(Target.java:385)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
        at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
        at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
        at org.apache.tools.ant.Main.runBuild(Main.java:758)
        at org.apache.tools.ant.Main.startAnt(Main.java:217)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)

Total time: 0 seconds



-----Original Message-----
From: John Sichi [mailto:jsichi@facebook.com]
Sent: Saturday, February 06, 2010 7:39 AM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:

<property environment="env"/>

first.

I'll log a JIRA issue for it, but it seems there are other problems even after I fix that, since the build currently installs Ivy under build/ivy rather than under ${ivy.home}; nothing else in build-common.xml references ivy.home.

JVS
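For reference, the fix JVS describes amounts to loading the OS environment into Ant properties before anything references ${env.IVY_HOME}. A minimal sketch using standard Ant tasks follows; the fallback default and property wiring here are illustrative, not Hive's actual build-common.xml:

```xml
<!-- Load OS environment variables into Ant properties with the "env." prefix.
     This must appear before any use of ${env.IVY_HOME}. -->
<property environment="env"/>

<!-- Fall back to ~/.ant when IVY_HOME is not set (illustrative default). -->
<condition property="ivy.home" value="${env.IVY_HOME}"
           else="${user.home}/.ant">
  <isset property="env.IVY_HOME"/>
</condition>
```

The `else` attribute on `<condition>` requires Ant 1.6.3 or later.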

On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:

> Hi guys,
>
> Can you try making the following directory layout the same as mine?
> Once that is done, remove the "build" directory and run "ant package".
>
> Does this solve the problem?
>
>
>
> [zshao@dev ~/.ant] ls -lR
> .:
> total 3896
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>
> ./apache-ivy-2.0.0-rc2:
> total 880
> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>
> ./cache:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>
> ./cache/hadoop:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>
> ./cache/hadoop/core:
> total 4
> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>
> ./cache/hadoop/core/sources:
> total 127436
> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>
> ./lib:
> total 880
> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
>
>
> Zheng
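Zheng's layout above can be recreated with a few shell commands. This is a sketch assuming the default ~/.ant location; ANT_DIR is an illustrative override for convenience, not a variable the Hive build itself reads:

```shell
# Recreate the Ivy cache layout shown above.
ANT_DIR="${ANT_DIR:-$HOME/.ant}"
mkdir -p "$ANT_DIR/cache/hadoop/core/sources" "$ANT_DIR/lib"

# Pre-seed the Hadoop source tarball so Ivy finds it without going to the
# network (uncomment to actually download; requires network access):
# wget -P "$ANT_DIR/cache/hadoop/core/sources" \
#   http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz

# Verify the layout.
ls -lR "$ANT_DIR"
```

After seeding the cache, remove the build directory and re-run "ant package" as Zheng suggests.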
>
> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
> <vi...@onmobile.com> wrote:
>> Hi ,
>>
>>
>>
>> We are still getting the problem
>>
>>
>>
>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>
>> Overriding previous definition of property "ivy.version"
>>
>> [ivy:retrieve] using ivy parser to parse
>> file:/master/hadoop/hive/shims/ivy.xml
>>
>> [ivy:retrieve] :: resolving dependencies ::
>> org.apache.hadoop.hive#shims;working@ph1
>>
>> [ivy:retrieve]  confs: [default]
>>
>> [ivy:retrieve]  validate = true
>>
>> [ivy:retrieve]  refresh = false
>>
>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>
>> [ivy:retrieve] == resolving dependencies for
>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>
>> [ivy:retrieve] == resolving dependencies
>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>
>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>> {*=[*]}
>>
>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>> hadoop#core;0.20.1
>>
>> [ivy:retrieve]          tried
>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>
>>
>>
>> And the .pom for this is not getting copied, please suggest something on
>> this.
>>
>>
>>
>> Regards
>>
>> Vidyasagar N V
>>
>>
>>
>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>> Sent: Friday, February 05, 2010 4:59 PM
>>
>> To: hive-user@hadoop.apache.org
>> Subject: RE: Hive Installation Problem
>>
>>
>>
>> No, I don't have the variable defined. Is there anything else I should check?
>> Is this happening because I am building against Hadoop 0.20.1?
>>
>>
>>
>> Babu
>>
>>
>>
>>
>>
>> From: Carl Steinbach [mailto:carl@cloudera.com]
>> Sent: Friday, February 05, 2010 3:07 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>>
>>
>> Hi Babu,
>>
>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>> environment variable IVY_HOME
>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>> this environment
>> variable set to a value different than ~/.ant?
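A quick way to check which cache directory will be used, following the rule Carl describes ($IVY_HOME/cache when IVY_HOME is set, else ~/.ant/cache); the echoed text is just illustrative:

```shell
# Report the Ivy cache directory the Hive build will use.
if [ -n "${IVY_HOME:-}" ]; then
  CACHE_DIR="$IVY_HOME/cache"
else
  CACHE_DIR="$HOME/.ant/cache"
fi
echo "Ivy cache directory: $CACHE_DIR"
```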
>>
>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>
>> I have tried the same, but the installation still gives the same error. I
>> don't know whether it is even looking in the cache. Can we change
>> ivysettings.xml so that it resolves the file from the local file system
>> rather than through a URL?
>>
>> Babu
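Ivy does support resolving from the local file system via a filesystem resolver. The sketch below uses standard Ivy 2.0 settings syntax, but the resolver name, artifact pattern, and module mapping are illustrative and have not been checked against the ivysettings.xml actually shipped with Hive 0.4:

```xml
<!-- Illustrative ivysettings.xml fragment: resolve the Hadoop source tarball
     from the local Ivy cache instead of a remote URL. -->
<resolvers>
  <filesystem name="hadoop-local">
    <artifact pattern="${user.home}/.ant/cache/hadoop/core/sources/hadoop-[revision].tar.gz"/>
  </filesystem>
</resolvers>
<modules>
  <!-- Route hadoop#core lookups to the local resolver. -->
  <module organisation="hadoop" name="core" resolver="hadoop-local"/>
</modules>
```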
>>
>> -----Original Message-----
>> From: Zheng Shao [mailto:zshao9@gmail.com]
>> Sent: Friday, February 05, 2010 12:47 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>
>> Zheng
>>
>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>> Try this:
>>>
>>> cd ~/.ant/cache/hadoop/core/sources
>>> wget
>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>
>>>
>>> Zheng
>>>
>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>>> Hello ,
>>>>
>>>> I am new to Hadoop and is trying to install Hive now. We have the
>>>> following setup at our side
>>>>
>>>> OS - Ubuntu 9.10
>>>> Hadoop - 0.20.1
>>>> Hive installation tried - 0.4.0 .
>>>>
>>>> The Hadoop is installed and is working fine . Now when we were installing
>>>> Hive I got error that it couldn't resolve the dependencies. I changed the
>>>> shims build and properties xml to make the dependencies look for Hadoop
>>>> 0.20.1 . But now when I call the ant script I get the following error
>>>>
>>>> ivy-retrieve-hadoop-source:
>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>>> http://ant.apache.org/ivy/ :
>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>>> [ivy:retrieve] :: resolving dependencies ::
>>>> org.apache.hadoop.hive#shims;working
>>>> [ivy:retrieve]  confs: [default]
>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl
>>>> 0ms
>>>>
>>>> ---------------------------------------------------------------------
>>>>       |                  |            modules            ||   artifacts
>>>>  |
>>>>       |       conf       | number| search|dwnlded|evicted||
>>>> number|dwnlded|
>>>>
>>>> ---------------------------------------------------------------------
>>>>       |      default     |   1   |   0   |   0   |   0   ||   0   |   0
>>>>  |
>>>>
>>>> ---------------------------------------------------------------------
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: problems summary ::
>>>> [ivy:retrieve] :::: WARNINGS
>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>>> [ivy:retrieve]  ==== hadoop-source: tried
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== maven2: tried
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve] :::: ERRORS
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>>
>>>> BUILD FAILED
>>>> /master/hive/build.xml:148: The following error occurred while executing
>>>> this line:
>>>> /master/hive/build.xml:93: The following error occurred while executing
>>>> this line:
>>>> /master/hive/shims/build.xml:64: The following error occurred while
>>>> executing this line:
>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>>       resolve failed - see output for details
>>>>
>>>> Total time: 15 minutes 55 seconds
>>>>
>>>>
>>>> I have even tried to download hadoop-0.20.1.tar.gz and put it in the ant
>>>> cache of the user . Still the same error is repeated. I am stuck and not
>>>> able to install it .
>>>>
>>>> Any help on the above will be greatly appreciated.
>>>>
>>>> Babu
>>>>
>>>>
>>>> DISCLAIMER: The information in this message is confidential and may be
>>>> legally privileged. It is intended solely for the addressee. Access to this
>>>> message by anyone else is unauthorized. If you are not the intended
>>>> recipient, any disclosure, copying, or distribution of the message, or any
>>>> action or omission taken by you in reliance on it, is prohibited and may be
>>>> unlawful. Please immediately contact the sender if you have received this
>>>> message in error. Further, this e-mail may contain viruses and all
>>>> reasonable precaution to minimize the risk arising there from is taken by
>>>> OnMobile. OnMobile is not liable for any damage sustained by you as a result
>>>> of any virus in this e-mail. All applicable virus checks should be carried
>>>> out by you before opening this e-mail or any attachment thereto.
>>>> Thank you - OnMobile Global Limited.
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>>
>>
>>
>>
>>
>>
>>
>>
>
>
>
> --
> Yours,
> Zheng



RE: Hive Installation Problem

Posted by Vidyasagar Venkata Nallapati <vi...@onmobile.com>.
Hi,

I am getting a problem while downloading the .pom file from

https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom
ERROR: Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/core-0.20.0.pom

Has this repository changed?

Regards
Vidyasagar N V

-----Original Message-----
From: John Sichi [mailto:jsichi@facebook.com]
Sent: Saturday, February 06, 2010 7:39 AM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:

<property environment="env"/>

first.

I'll log a JIRA issue for it, but it seems there are other problems with it even after I fix that since the build is currenlty installing ivy under build/ivy rather than under ${ivy.home}; nothing else in build-common.xml references ivy.home.

JVS

On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:

> HI guys,
>
> Can you have a try to make the following directory the same as mine?
> Once this is done, remove the "build" directory, and run "ant package".
>
> Does this solve the problem?
>
>
>
> [zshao@dev ~/.ant] ls -lR
> .:
> total 3896
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>
> ./apache-ivy-2.0.0-rc2:
> total 880
> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>
> ./cache:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>
> ./cache/hadoop:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>
> ./cache/hadoop/core:
> total 4
> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>
> ./cache/hadoop/core/sources:
> total 127436
> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>
> ./lib:
> total 880
> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
>
>
> Zheng
>
> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
> <vi...@onmobile.com> wrote:
>> Hi ,
>>
>>
>>
>> We are still getting the problem
>>
>>
>>
>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>
>> Overriding previous definition of property "ivy.version"
>>
>> [ivy:retrieve] using ivy parser to parse
>> file:/master/hadoop/hive/shims/ivy.xml
>>
>> [ivy:retrieve] :: resolving dependencies ::
>> org.apache.hadoop.hive#shims;working@ph1
>>
>> [ivy:retrieve]  confs: [default]
>>
>> [ivy:retrieve]  validate = true
>>
>> [ivy:retrieve]  refresh = false
>>
>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>
>> [ivy:retrieve] == resolving dependencies for
>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>
>> [ivy:retrieve] == resolving dependencies
>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>
>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>> {*=[*]}
>>
>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>> hadoop#core;0.20.1
>>
>> [ivy:retrieve]          tried
>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>
>>
>>
>> And the .pom for this is not getting copied, please suggest something on
>> this.
>>
>>
>>
>> Regards
>>
>> Vidyasagar N V
>>
>>
>>
>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>> Sent: Friday, February 05, 2010 4:59 PM
>>
>> To: hive-user@hadoop.apache.org
>> Subject: RE: Hive Installation Problem
>>
>>
>>
>> No I don't have the variable defined. Any other things that I have to check.
>> Is this happening because I am trying for Hadoop 0.20.1
>>
>>
>>
>> Babu
>>
>>
>>
>>
>>
>> From: Carl Steinbach [mailto:carl@cloudera.com]
>> Sent: Friday, February 05, 2010 3:07 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>>
>>
>> Hi Babu,
>>
>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>> environment variable IVY_HOME
>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>> this environment
>> variable set to a value different than ~/.ant?
>>
>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>
>> I have tried the same but still the installation is giving the same error. I
>> don't know if it is looking in the cache . Can we make any change in
>> ivysettings.xml that it has to resolve the file from the file system rather
>> through an url.
>>
>> Babu
>>
>> -----Original Message-----
>> From: Zheng Shao [mailto:zshao9@gmail.com]
>> Sent: Friday, February 05, 2010 12:47 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>
>> Zheng
>>
>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>> Try this:
>>>
>>> cd ~/.ant/cache/hadoop/core/sources
>>> wget
>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>
>>>
>>> Zheng
>>>
>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>>> Hello ,
>>>>
>>>> I am new to Hadoop and is trying to install Hive now. We have the
>>>> following setup at our side
>>>>
>>>> OS - Ubuntu 9.10
>>>> Hadoop - 0.20.1
>>>> Hive installation tried - 0.4.0 .
>>>>
>>>> The Hadoop is installed and is working fine . Now when we were installing
>>>> Hive I got error that it couldn't resolve the dependencies. I changed the
>>>> shims build and properties xml to make the dependencies look for Hadoop
>>>> 0.20.1 . But now when I call the ant script I get the following error
>>>>
>>>> ivy-retrieve-hadoop-source:
>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>>> http://ant.apache.org/ivy/ :
>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>>> [ivy:retrieve] :: resolving dependencies ::
>>>> org.apache.hadoop.hive#shims;working
>>>> [ivy:retrieve]  confs: [default]
>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl
>>>> 0ms
>>>>
>>>> ---------------------------------------------------------------------
>>>>       |                  |            modules            ||   artifacts
>>>>  |
>>>>       |       conf       | number| search|dwnlded|evicted||
>>>> number|dwnlded|
>>>>
>>>> ---------------------------------------------------------------------
>>>>       |      default     |   1   |   0   |   0   |   0   ||   0   |   0
>>>>  |
>>>>
>>>> ---------------------------------------------------------------------
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: problems summary ::
>>>> [ivy:retrieve] :::: WARNINGS
>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>>> [ivy:retrieve]  ==== hadoop-source: tried
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== maven2: tried
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve] :::: ERRORS
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>>
>>>> BUILD FAILED
>>>> /master/hive/build.xml:148: The following error occurred while executing
>>>> this line:
>>>> /master/hive/build.xml:93: The following error occurred while executing
>>>> this line:
>>>> /master/hive/shims/build.xml:64: The following error occurred while
>>>> executing this line:
>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>>       resolve failed - see output for details
>>>>
>>>> Total time: 15 minutes 55 seconds
>>>>
>>>>
>>>> I have even tried to download hadoop-0.20.1.tar.gz and put it in the ant
>>>> cache of the user . Still the same error is repeated. I am stuck and not
>>>> able to install it .
>>>>
>>>> Any help on the above will be greatly appreciated.
>>>>
>>>> Babu
>>>>
>>>>
>>>> DISCLAIMER: The information in this message is confidential and may be
>>>> legally privileged. It is intended solely for the addressee. Access to this
>>>> message by anyone else is unauthorized. If you are not the intended
>>>> recipient, any disclosure, copying, or distribution of the message, or any
>>>> action or omission taken by you in reliance on it, is prohibited and may be
>>>> unlawful. Please immediately contact the sender if you have received this
>>>> message in error. Further, this e-mail may contain viruses and all
>>>> reasonable precaution to minimize the risk arising there from is taken by
>>>> OnMobile. OnMobile is not liable for any damage sustained by you as a result
>>>> of any virus in this e-mail. All applicable virus checks should be carried
>>>> out by you before opening this e-mail or any attachment thereto.
>>>> Thank you - OnMobile Global Limited.
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>>
>>
>>
>>
>>
>>
>>
>>
>
>
>
> --
> Yours,
> Zheng



Re: Hive Installation Problem

Posted by John Sichi <js...@facebook.com>.
By the way, the current IVY_HOME detection in build-common.xml is broken because it doesn't do:

<property environment="env"/>

first.
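For context, a minimal sketch of the intended ordering (the project, target, and property names around the environment line are illustrative assumptions, not Hive's actual build-common.xml):

```xml
<!-- Sketch: <property environment="env"/> must appear before any reference
     to env.* properties; the surrounding names are invented for this example. -->
<project name="ivy-home-sketch" default="show-ivy-home">
  <!-- Load OS environment variables into the env.* property namespace -->
  <property environment="env"/>
  <!-- Resolves only because the environment was loaded above -->
  <property name="ivy.home" value="${env.IVY_HOME}"/>
  <target name="show-ivy-home">
    <echo message="ivy.home = ${ivy.home}"/>
  </target>
</project>
```

Without the `environment` line, `${env.IVY_HOME}` would never expand, which matches the detection failure described above.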

I'll log a JIRA issue for it, but even after that fix there seem to be other problems: the build currently installs Ivy under build/ivy rather than under ${ivy.home}, and nothing else in build-common.xml references ivy.home.

JVS

On Feb 5, 2010, at 1:15 PM, Zheng Shao wrote:

> Hi guys,
>
> Can you try making the following directory layout the same as mine?
> Once that is done, remove the "build" directory and run "ant package".
>
> Does this solve the problem?
>
>
>
> [zshao@dev ~/.ant] ls -lR
> .:
> total 3896
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
> -rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
> -rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
> drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
> drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib
>
> ./apache-ivy-2.0.0-rc2:
> total 880
> -rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar
>
> ./cache:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop
>
> ./cache/hadoop:
> total 4
> drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core
>
> ./cache/hadoop/core:
> total 4
> drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources
>
> ./cache/hadoop/core/sources:
> total 127436
> -rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
> -rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
> -rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
> -rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz
>
> ./lib:
> total 880
> -rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
>
>
> Zheng
>
> On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
> <vi...@onmobile.com> wrote:
>> Hi ,
>>
>>
>>
>> We are still getting the problem
>>
>>
>>
>> [ivy:retrieve] no resolved descriptor found: launching default resolve
>>
>> Overriding previous definition of property "ivy.version"
>>
>> [ivy:retrieve] using ivy parser to parse
>> file:/master/hadoop/hive/shims/ivy.xml
>>
>> [ivy:retrieve] :: resolving dependencies ::
>> org.apache.hadoop.hive#shims;working@ph1
>>
>> [ivy:retrieve]  confs: [default]
>>
>> [ivy:retrieve]  validate = true
>>
>> [ivy:retrieve]  refresh = false
>>
>> [ivy:retrieve] resolving dependencies for configuration 'default'
>>
>> [ivy:retrieve] == resolving dependencies for
>> org.apache.hadoop.hive#shims;working@ph1 [default]
>>
>> [ivy:retrieve] == resolving dependencies
>> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>>
>> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
>> {*=[*]}
>>
>> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
>> hadoop#core;0.20.1
>>
>> [ivy:retrieve]          tried
>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>
>>
>>
>> And the .pom for this is not getting copied, please suggest something on
>> this.
>>
>>
>>
>> Regards
>>
>> Vidyasagar N V
>>
>>
>>
>> From: baburaj.S [mailto:baburaj.s@onmobile.com]
>> Sent: Friday, February 05, 2010 4:59 PM
>>
>> To: hive-user@hadoop.apache.org
>> Subject: RE: Hive Installation Problem
>>
>>
>>
>> No, I don't have the variable defined. Is there anything else I should
>> check? Could this be happening because I am building against Hadoop 0.20.1?
>>
>>
>>
>> Babu
>>
>>
>>
>>
>>
>> From: Carl Steinbach [mailto:carl@cloudera.com]
>> Sent: Friday, February 05, 2010 3:07 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>>
>>
>> Hi Babu,
>>
>> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
>> environment variable IVY_HOME
>> is set it will use $IVY_HOME/cache instead. Is it possible that you have
>> this environment
>> variable set to a value different than ~/.ant?
>>
>> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>>
>> I have tried the same, but the installation still gives the same error. I
>> don't know whether it is looking in the cache. Can we change ivysettings.xml
>> so that it resolves the file from the local file system rather than through
>> a URL?
>>
>> Babu
>>
>> -----Original Message-----
>> From: Zheng Shao [mailto:zshao9@gmail.com]
>> Sent: Friday, February 05, 2010 12:47 PM
>> To: hive-user@hadoop.apache.org
>> Subject: Re: Hive Installation Problem
>>
>> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>>
>> Zheng
>>
>> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>>> Try this:
>>>
>>> cd ~/.ant/cache/hadoop/core/sources
>>> wget
>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>
>>>
>>> Zheng
>>>
>>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>>> Hello,
>>>>
>>>> I am new to Hadoop and am now trying to install Hive. We have the
>>>> following setup on our side:
>>>>
>>>> OS - Ubuntu 9.10
>>>> Hadoop - 0.20.1
>>>> Hive installation tried - 0.4.0 .
>>>>
>>>> Hadoop is installed and working fine. When we were installing Hive, I got
>>>> an error saying it couldn't resolve the dependencies. I changed the shims
>>>> build and properties XML so the dependencies look for Hadoop 0.20.1, but
>>>> now when I run the ant script I get the following error:
>>>>
>>>> ivy-retrieve-hadoop-source:
>>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>>> http://ant.apache.org/ivy/ :
>>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>>> [ivy:retrieve] :: resolving dependencies ::
>>>> org.apache.hadoop.hive#shims;working
>>>> [ivy:retrieve]  confs: [default]
>>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
>>>>        ---------------------------------------------------------------------
>>>>        |                  |            modules            ||   artifacts   |
>>>>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>>>>        ---------------------------------------------------------------------
>>>>        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>>>>        ---------------------------------------------------------------------
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: problems summary ::
>>>> [ivy:retrieve] :::: WARNINGS
>>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>>> [ivy:retrieve]  ==== hadoop-source: tried
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  ==== maven2: tried
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>>> [ivy:retrieve]
>>>> http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>>> [ivy:retrieve] :::: ERRORS
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>>> [ivy:retrieve]  Server access Error: Connection timed out
>>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>>> [ivy:retrieve]
>>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>>
>>>> BUILD FAILED
>>>> /master/hive/build.xml:148: The following error occurred while executing
>>>> this line:
>>>> /master/hive/build.xml:93: The following error occurred while executing
>>>> this line:
>>>> /master/hive/shims/build.xml:64: The following error occurred while
>>>> executing this line:
>>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>>       resolve failed - see output for details
>>>>
>>>> Total time: 15 minutes 55 seconds
>>>>
>>>>
>>>> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the
>>>> user's ant cache, but the same error is repeated. I am stuck and unable
>>>> to install it.
>>>>
>>>> Any help on the above will be greatly appreciated.
>>>>
>>>> Babu
>>>>
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>> Yours,
>>> Zheng
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>>
>>
>>
>>
>>
>>
>>
>>
>
>
>
> --
> Yours,
> Zheng


Re: Hive Installation Problem

Posted by Zheng Shao <zs...@gmail.com>.
Hi guys,

Can you try making the following directory layout the same as mine?
Once that is done, remove the "build" directory and run "ant package".

Does this solve the problem?



[zshao@dev ~/.ant] ls -lR
.:
total 3896
drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 apache-ivy-2.0.0-rc2
-rw-r--r--  1 zshao users 3965953 Nov  4  2008 apache-ivy-2.0.0-rc2-bin.zip
-rw-r--r--  1 zshao users       0 Feb  5 13:04 apache-ivy-2.0.0-rc2.installed
drwxr-xr-x  3 zshao users    4096 Feb  5 13:07 cache
drwxr-xr-x  2 zshao users    4096 Feb  5 13:04 lib

./apache-ivy-2.0.0-rc2:
total 880
-rw-r--r--  1 zshao users 893199 Oct 28  2008 ivy-2.0.0-rc2.jar

./cache:
total 4
drwxr-xr-x  3 zshao users 4096 Feb  4 19:30 hadoop

./cache/hadoop:
total 4
drwxr-xr-x  3 zshao users 4096 Feb  5 13:08 core

./cache/hadoop/core:
total 4
drwxr-xr-x  2 zshao users 4096 Feb  4 19:30 sources

./cache/hadoop/core/sources:
total 127436
-rw-r--r--  1 zshao users 14427013 Aug 20  2008 hadoop-0.17.2.1.tar.gz
-rw-r--r--  1 zshao users 30705253 Jan 22  2009 hadoop-0.18.3.tar.gz
-rw-r--r--  1 zshao users 42266180 Nov 13  2008 hadoop-0.19.0.tar.gz
-rw-r--r--  1 zshao users 42813980 Apr  8  2009 hadoop-0.20.0.tar.gz

./lib:
total 880
-rw-r--r--  1 zshao users 893199 Feb  5 13:04 ivy-2.0.0-rc2.jar
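The layout above can be bootstrapped with a short script. This is a sketch under the assumption that the default ~/.ant cache is in use (or $IVY_HOME/cache, when set); HADOOP_VERSION is a placeholder to match the hadoop.version your build resolves:

```shell
#!/bin/sh
# Recreate the Ivy source-cache layout shown above and report where the
# Hadoop source tarball belongs. HADOOP_VERSION is an assumption; adjust
# it to the version your Hive build is configured for.
HADOOP_VERSION="${HADOOP_VERSION:-0.20.1}"
SOURCES_DIR="${IVY_HOME:-$HOME/.ant}/cache/hadoop/core/sources"
mkdir -p "$SOURCES_DIR"
echo "Place hadoop-${HADOOP_VERSION}.tar.gz (from http://archive.apache.org/dist/hadoop/core/hadoop-${HADOOP_VERSION}/) in $SOURCES_DIR"
```

With the tarball in place, Ivy's hadoop-source resolver should find it in the cache instead of going to the network.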


Zheng

On Fri, Feb 5, 2010 at 5:49 AM, Vidyasagar Venkata Nallapati
<vi...@onmobile.com> wrote:
> Hi ,
>
>
>
> We are still getting the problem
>
>
>
> [ivy:retrieve] no resolved descriptor found: launching default resolve
>
> Overriding previous definition of property "ivy.version"
>
> [ivy:retrieve] using ivy parser to parse
> file:/master/hadoop/hive/shims/ivy.xml
>
> [ivy:retrieve] :: resolving dependencies ::
> org.apache.hadoop.hive#shims;working@ph1
>
> [ivy:retrieve]  confs: [default]
>
> [ivy:retrieve]  validate = true
>
> [ivy:retrieve]  refresh = false
>
> [ivy:retrieve] resolving dependencies for configuration 'default'
>
> [ivy:retrieve] == resolving dependencies for
> org.apache.hadoop.hive#shims;working@ph1 [default]
>
> [ivy:retrieve] == resolving dependencies
> org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
>
> [ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1
> {*=[*]}
>
> [ivy:retrieve]  hadoop-source: no ivy file nor artifact found for
> hadoop#core;0.20.1
>
> [ivy:retrieve]          tried
> https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>
>
>
> And the .pom for this is not getting copied, please suggest something on
> this.
>
>
>
> Regards
>
> Vidyasagar N V
>
>
>
> From: baburaj.S [mailto:baburaj.s@onmobile.com]
> Sent: Friday, February 05, 2010 4:59 PM
>
> To: hive-user@hadoop.apache.org
> Subject: RE: Hive Installation Problem
>
>
>
> No, I don't have the variable defined. Is there anything else I should
> check? Could this be happening because I am building against Hadoop 0.20.1?
>
>
>
> Babu
>
>
>
>
>
> From: Carl Steinbach [mailto:carl@cloudera.com]
> Sent: Friday, February 05, 2010 3:07 PM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
>
>
> Hi Babu,
>
> ~/.ant/cache is the default Ivy cache directory for Hive, but if the
> environment variable IVY_HOME
> is set it will use $IVY_HOME/cache instead. Is it possible that you have
> this environment
> variable set to a value different than ~/.ant?
>
> On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
>
> I have tried the same, but the installation still gives the same error. I
> don't know whether it is looking in the cache. Can we change ivysettings.xml
> so that it resolves the file from the local file system rather than through
> a URL?
>
> Babu
>
> -----Original Message-----
> From: Zheng Shao [mailto:zshao9@gmail.com]
> Sent: Friday, February 05, 2010 12:47 PM
> To: hive-user@hadoop.apache.org
> Subject: Re: Hive Installation Problem
>
> Added to http://wiki.apache.org/hadoop/Hive/FAQ
>
> Zheng
>
> On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
>> Try this:
>>
>> cd ~/.ant/cache/hadoop/core/sources
>> wget
>> http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>
>>
>> Zheng
>>
>> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>>> Hello,
>>>
>>> I am new to Hadoop and am now trying to install Hive. We have the
>>> following setup on our side:
>>>
>>> OS - Ubuntu 9.10
>>> Hadoop - 0.20.1
>>> Hive installation tried - 0.4.0 .
>>>
>>> Hadoop is installed and working fine. When we were installing Hive, I got
>>> an error saying it couldn't resolve the dependencies. I changed the shims
>>> build and properties XML so the dependencies look for Hadoop 0.20.1, but
>>> now when I run the ant script I get the following error:
>>>
>>> ivy-retrieve-hadoop-source:
>>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 ::
>>> http://ant.apache.org/ivy/ :
>>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>>> [ivy:retrieve] :: resolving dependencies ::
>>> org.apache.hadoop.hive#shims;working
>>> [ivy:retrieve]  confs: [default]
>>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
>>>        ---------------------------------------------------------------------
>>>        |                  |            modules            ||   artifacts   |
>>>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>>>        ---------------------------------------------------------------------
>>>        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>>>        ---------------------------------------------------------------------
>>> [ivy:retrieve]
>>> [ivy:retrieve] :: problems summary ::
>>> [ivy:retrieve] :::: WARNINGS
>>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>>> [ivy:retrieve]  ==== hadoop-source: tried
>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>> [ivy:retrieve]
>>>  http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>> [ivy:retrieve]  ==== apache-snapshot: tried
>>> [ivy:retrieve]
>>>  https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>> [ivy:retrieve]
>>>  https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>> [ivy:retrieve]  ==== maven2: tried
>>> [ivy:retrieve]
>>>  http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>>> [ivy:retrieve]
>>>  http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>>> [ivy:retrieve] :::: ERRORS
>>> [ivy:retrieve]  Server access Error: Connection timed out
>>> url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>>> [ivy:retrieve]  Server access Error: Connection timed out
>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>>> [ivy:retrieve]  Server access Error: Connection timed out
>>> url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>>> [ivy:retrieve]  Server access Error: Connection timed out
>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>>> [ivy:retrieve]  Server access Error: Connection timed out
>>> url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>>> [ivy:retrieve]
>>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>>
>>> BUILD FAILED
>>> /master/hive/build.xml:148: The following error occurred while executing
>>> this line:
>>> /master/hive/build.xml:93: The following error occurred while executing
>>> this line:
>>> /master/hive/shims/build.xml:64: The following error occurred while
>>> executing this line:
>>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>>        resolve failed - see output for details
>>>
>>> Total time: 15 minutes 55 seconds
>>>
>>>
>>> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the
>>> user's ant cache, but the same error is repeated. I am stuck and unable
>>> to install it.
>>>
>>> Any help on the above will be greatly appreciated.
>>>
>>> Babu
>>>
>>>
>>>
>>
>>
>>
>> --
>> Yours,
>> Zheng
>>
>
>
>
> --
> Yours,
> Zheng
>
>
>
>
>
>
>
>



-- 
Yours,
Zheng

RE: Hive Installation Problem

Posted by Vidyasagar Venkata Nallapati <vi...@onmobile.com>.
Hi,

We are still getting the problem:

[ivy:retrieve] no resolved descriptor found: launching default resolve
Overriding previous definition of property "ivy.version"
[ivy:retrieve] using ivy parser to parse file:/master/hadoop/hive/shims/ivy.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working@ph1
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  validate = true
[ivy:retrieve]  refresh = false
[ivy:retrieve] resolving dependencies for configuration 'default'
[ivy:retrieve] == resolving dependencies for org.apache.hadoop.hive#shims;working@ph1 [default]
[ivy:retrieve] == resolving dependencies org.apache.hadoop.hive#shims;working@ph1->hadoop#core;0.20.1 [default->*]
[ivy:retrieve] default: Checking cache for: dependency: hadoop#core;0.20.1 {*=[*]}
[ivy:retrieve]  hadoop-source: no ivy file nor artifact found for hadoop#core;0.20.1
[ivy:retrieve]          tried https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom

The .pom for this is never downloaded; please suggest something we can try.

Regards
Vidyasagar N V

From: baburaj.S [mailto:baburaj.s@onmobile.com]
Sent: Friday, February 05, 2010 4:59 PM
To: hive-user@hadoop.apache.org
Subject: RE: Hive Installation Problem

No, I don't have the variable defined. Is there anything else I should check? Could this be happening because I am building against Hadoop 0.20.1?

Babu


From: Carl Steinbach [mailto:carl@cloudera.com]
Sent: Friday, February 05, 2010 3:07 PM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

Hi Babu,

~/.ant/cache is the default Ivy cache directory for Hive, but if the environment variable IVY_HOME
is set, it will use $IVY_HOME/cache instead. Is it possible that you have this environment
variable set to a value other than ~/.ant?
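The fallback described above can be expressed as a small shell sketch (ivy_cache_dir is a helper name invented here; it mirrors the documented behaviour, not Hive's actual build logic):

```shell
#!/bin/sh
# Print the cache directory the Hive build will use: $IVY_HOME/cache when
# IVY_HOME is set, otherwise ~/.ant/cache.
ivy_cache_dir() {
  if [ -n "${IVY_HOME:-}" ]; then
    printf '%s/cache\n' "$IVY_HOME"
  else
    printf '%s/.ant/cache\n' "$HOME"
  fi
}
ivy_cache_dir
```

Running this in the same shell as the failing ant invocation shows which cache directory the downloaded tarball must land in.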
On Fri, Feb 5, 2010 at 12:09 AM, baburaj.S <ba...@onmobile.com> wrote:
I have tried the same, but the installation still gives the same error. I don't know whether it is looking in the cache. Can we change ivysettings.xml so that it resolves the file from the local file system rather than through a URL?

Babu
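For what it's worth, Ivy does support resolving artifacts from the local file system via a filesystem resolver in ivysettings.xml. The fragment below is only a hypothetical sketch (the resolver name, directory, and pattern are illustrative, not Hive's shipped settings), but it shows the general shape of pointing the hadoop#core module at a local copy:

```xml
<!-- Hypothetical ivysettings.xml fragment: resolve hadoop#core from a
     local directory instead of fetching it over HTTP. Adjust the
     pattern to wherever the tarball was downloaded. -->
<ivysettings>
  <resolvers>
    <filesystem name="local-hadoop">
      <artifact pattern="/master/downloads/[artifact]-[revision].[ext]"/>
    </filesystem>
  </resolvers>
  <modules>
    <module organisation="hadoop" name="core" resolver="local-hadoop"/>
  </modules>
</ivysettings>
```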


-----Original Message-----
From: Zheng Shao [mailto:zshao9@gmail.com]
Sent: Friday, February 05, 2010 12:47 PM
To: hive-user@hadoop.apache.org
Subject: Re: Hive Installation Problem

Added to http://wiki.apache.org/hadoop/Hive/FAQ

Zheng

On Thu, Feb 4, 2010 at 11:11 PM, Zheng Shao <zs...@gmail.com> wrote:
> Try this:
>
> cd ~/.ant/cache/hadoop/core/sources
> wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>
>
> Zheng
>
> On Thu, Feb 4, 2010 at 10:23 PM, baburaj.S <ba...@onmobile.com> wrote:
>> Hello,
>>
>> I am new to Hadoop and am now trying to install Hive. We have the following setup on our side:
>>
>> OS - Ubuntu 9.10
>> Hadoop - 0.20.1
>> Hive version tried - 0.4.0
>>
>> Hadoop is installed and working fine. While installing Hive, I got an error saying it couldn't resolve the dependencies. I changed the shims build and properties XML files to make the dependencies point at Hadoop 0.20.1, but now when I run the ant script I get the following error:
>>
>> ivy-retrieve-hadoop-source:
>> [ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ :
>> :: loading settings :: file = /master/hive/ivy/ivysettings.xml
>> [ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#shims;working
>> [ivy:retrieve]  confs: [default]
>> [ivy:retrieve] :: resolution report :: resolve 953885ms :: artifacts dl 0ms
>>        ---------------------------------------------------------------------
>>        |                  |            modules            ||   artifacts   |
>>        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
>>        ---------------------------------------------------------------------
>>        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
>>        ---------------------------------------------------------------------
>> [ivy:retrieve]
>> [ivy:retrieve] :: problems summary ::
>> [ivy:retrieve] :::: WARNINGS
>> [ivy:retrieve]          module not found: hadoop#core;0.20.1
>> [ivy:retrieve]  ==== hadoop-source: tried
>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>> [ivy:retrieve]    http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  ==== apache-snapshot: tried
>> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>> [ivy:retrieve]    https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  ==== maven2: tried
>> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]    -- artifact hadoop#core;0.20.1!hadoop.tar.gz(source):
>> [ivy:retrieve]    http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:retrieve]          ::          UNRESOLVED DEPENDENCIES         ::
>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:retrieve] :::: ERRORS
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>> [ivy:retrieve]
>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>
>> BUILD FAILED
>> /master/hive/build.xml:148: The following error occurred while executing this line:
>> /master/hive/build.xml:93: The following error occurred while executing this line:
>> /master/hive/shims/build.xml:64: The following error occurred while executing this line:
>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>        resolve failed - see output for details
>>
>> Total time: 15 minutes 55 seconds
>>
>>
>> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the user's ant cache, but the same error is repeated. I am stuck and unable to install it.
>>
>> Any help on the above will be greatly appreciated.
>>
>> Babu
>>
>>
>> DISCLAIMER: The information in this message is confidential and may be legally privileged. It is intended solely for the addressee. Access to this message by anyone else is unauthorized. If you are not the intended recipient, any disclosure, copying, or distribution of the message, or any action or omission taken by you in reliance on it, is prohibited and may be unlawful. Please immediately contact the sender if you have received this message in error. Further, this e-mail may contain viruses and all reasonable precaution to minimize the risk arising there from is taken by OnMobile. OnMobile is not liable for any damage sustained by you as a result of any virus in this e-mail. All applicable virus checks should be carried out by you before opening this e-mail or any attachment thereto.
>> Thank you - OnMobile Global Limited.
>>
>
>
>
> --
> Yours,
> Zheng
>



--
Yours,
Zheng
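
Before re-running ant after Zheng's workaround, it can help to confirm the tarball actually landed where the hadoop-source resolver looks (a sketch assuming the default ~/.ant cache path; adjust if IVY_HOME is set):

```shell
# Check whether the pre-seeded Hadoop tarball is where the Hive
# build's Ivy resolver expects to find it.
TARBALL="$HOME/.ant/cache/hadoop/core/sources/hadoop-0.20.1.tar.gz"
if [ -f "$TARBALL" ]; then
  echo "found: $TARBALL"
else
  echo "missing: $TARBALL"
fi
```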


>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:retrieve]          :: hadoop#core;0.20.1: not found
>> [ivy:retrieve]          ::::::::::::::::::::::::::::::::::::::::::::::
>> [ivy:retrieve] :::: ERRORS
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://archive.apache.org/dist/hadoop/core/hadoop-0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]  Server access Error: Connection timed out url=https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.1/hadoop-0.20.1.tar.gz
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.pom
>> [ivy:retrieve]  Server access Error: Connection timed out url=http://repo1.maven.org/maven2/hadoop/core/0.20.1/core-0.20.1.tar.gz
>> [ivy:retrieve]
>> [ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
>>
>> BUILD FAILED
>> /master/hive/build.xml:148: The following error occurred while executing this line:
>> /master/hive/build.xml:93: The following error occurred while executing this line:
>> /master/hive/shims/build.xml:64: The following error occurred while executing this line:
>> /master/hive/build-common.xml:172: impossible to resolve dependencies:
>>        resolve failed - see output for details
>>
>> Total time: 15 minutes 55 seconds
>>
>>
>> I have even tried downloading hadoop-0.20.1.tar.gz and putting it in the user's ant cache, but the same error is repeated. I am stuck and unable to install it.
>>
>> Any help on the above will be greatly appreciated.
>>
>> Babu
>>
>>
>>
>
>
>
> --
> Yours,
> Zheng
>



-- 
Yours,
Zheng