Posted to common-dev@hadoop.apache.org by Vasilis Liaskovitis <vl...@gmail.com> on 2009/11/29 20:05:17 UTC

build and use hadoop-git

Hi,

how can I build and use hadoop-git?
The project has recently been split into 3 repositories hadoop-common,
hadoop-hdfs and hadoop-mapred. It's not clear to me how to
build/compile and use the git/tip for the whole framework. E.g. would
building all jars from the 3 subprojects (and copying them under the
same directory) work?
Is there a guide/wiki page out there for this? Or perhaps there is
another repository which still has a centralized trunk for all
subprojects?
Thanks in advance,

- Vasilis

Re: build and use hadoop-git

Posted by Steve Loughran <st...@apache.org>.
Kay Kay wrote:
> Start with hadoop-common to start building .
> 
> hadoop-hdfs / hadoop-mapred pull the dependencies from apache snapshot 
> repository that contains the nightlies of last successful builds so in 
> theory all 3 could be built independently because of the respective 
> snapshots being present in apache snapshot repository.
> 
> If you do want to make cross-project changes and test them -
> * create a local ivy resolver and
> * place it before the apache snapshot in the ivy settings .
> * publish the jar for a given project to the directory pointed by the 
> ivy resolver in step 1
> * clear ivy cache
> * recompile.

I think you just need to tweak build.properties to use the internal 
resolver.

This is what I have symlinked into -common, -hdfs and -mapred

#This is symlinked across projects!
patch.version=1
resolvers=internal
#you can increment this number as you see fit
version=0.22.0-alpha-1
hadoop.version=${version}
hadoop-core.version=${version}
hadoop-hdfs.version=${version}
hadoop-mapred.version=${version}
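
Steve's setup above can be reproduced with a few shell commands. This is a minimal sketch; the `~/src`-style layout under a temp directory is an assumption, so substitute the directories where you actually cloned the three repositories:

```shell
# Sketch of the shared-build.properties setup (directory layout is assumed).
SRC="${TMPDIR:-/tmp}/hadoop-src-demo"
mkdir -p "$SRC/hadoop-common" "$SRC/hadoop-hdfs" "$SRC/hadoop-mapred"

# Write the shared properties file once...
cat > "$SRC/build.properties" <<'EOF'
#This is symlinked across projects!
patch.version=1
resolvers=internal
#you can increment this number as you see fit
version=0.22.0-alpha-1
hadoop.version=${version}
hadoop-core.version=${version}
hadoop-hdfs.version=${version}
hadoop-mapred.version=${version}
EOF

# ...then symlink it into each checkout so all three builds agree on the
# version numbers and on using the internal resolver.
for proj in hadoop-common hadoop-hdfs hadoop-mapred; do
  ln -sf "$SRC/build.properties" "$SRC/$proj/build.properties"
done

grep '^resolvers=' "$SRC/hadoop-hdfs/build.properties"
```

Because every project resolves the other two at the same `${version}`, bumping the version in the one shared file is enough to keep a cross-project build consistent.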


Re: build and use hadoop-git

Posted by Aaron Kimball <aa...@cloudera.com>.
See http://wiki.apache.org/hadoop/HowToContribute for more step-by-step
instructions.
- Aaron

On Fri, Jan 22, 2010 at 7:36 PM, Kay Kay <ka...@gmail.com> wrote:

> Start with hadoop-common to start building .
>
> hadoop-hdfs / hadoop-mapred pull the dependencies from apache snapshot
> repository that contains the nightlies of last successful builds so in
> theory all 3 could be built independently because of the respective
> snapshots being present in apache snapshot repository.
>
> If you do want to make cross-project changes and test them -
> * create a local ivy resolver and
> * place it before the apache snapshot in the ivy settings .
> * publish the jar for a given project to the directory pointed by the ivy
> resolver in step 1
> * clear ivy cache
> * recompile.
>
>
>
>
>
> On 11/29/09 11:05 AM, Vasilis Liaskovitis wrote:
>
>> Hi,
>>
>> how can I build and use hadoop-git?
>> The project has recently been split into 3 repositories hadoop-common,
>> hadoop-hdfs and hadoop-mapred. It's not clear to me how to
>> build/compile and use the git/tip for the whole framework. E.g. would
>> building all jars from the 3 subprojects (and copying them under the
>> same directory) work?
>> Is there a guide/wiki page out there for this? Or perhaps there is
>> another repository which still has a centralized trunk for all
>> subprojects?
>> thanks in advance,
>>
>> - Vasilis
>>
>>
>
>

Re: build and use hadoop-git

Posted by Kay Kay <ka...@gmail.com>.
Begin by building hadoop-common.

hadoop-hdfs and hadoop-mapred pull their dependencies from the Apache snapshot repository, which contains the nightlies of the last successful builds, so in theory all three can be built independently, since the respective snapshots are already present there.

If you do want to make cross-project changes and test them:
* create a local ivy resolver,
* place it before the apache snapshot resolver in the ivy settings,
* publish the jar for a given project to the directory pointed to by the ivy resolver in step 1,
* clear the ivy cache,
* recompile.
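
The cross-project flow above can be sketched roughly as follows. The resolver and chain names here are assumptions, not the project's actual configuration; compare against each checkout's `ivy/ivysettings.xml` and `build.xml` for the real names and publish targets:

```shell
# Sketch of the local-resolver flow (resolver/chain names are assumptions).
LOCAL_REPO="${TMPDIR:-/tmp}/local-ivy-repo"
mkdir -p "$LOCAL_REPO"

# Steps 1 and 2: define a local filesystem resolver and list it before the
# Apache snapshot resolver so locally published jars win dependency lookups.
cat > "$LOCAL_REPO/ivysettings-local.xml" <<EOF
<ivysettings>
  <resolvers>
    <filesystem name="local">
      <artifact pattern="$LOCAL_REPO/[organisation]/[module]/[revision]/[artifact]-[revision].[ext]"/>
    </filesystem>
    <chain name="default">
      <resolver ref="local"/>           <!-- consulted first -->
      <resolver ref="apache-snapshot"/> <!-- falls back to nightlies -->
    </chain>
  </resolvers>
</ivysettings>
EOF

# Step 3: publish (or simply copy) the freshly built jar into the layout
# the local resolver expects, e.g.:
#   cp build/hadoop-core-*.jar "$LOCAL_REPO/org.apache.hadoop/hadoop-core/..."

# Step 4: clear the cached Hadoop artifacts so stale snapshots are dropped:
rm -rf ~/.ivy2/cache/org.apache.hadoop

# Step 5: recompile the downstream project, e.g.:
#   (cd ../hadoop-hdfs && ant clean jar)

grep -c '<resolver ref=' "$LOCAL_REPO/ivysettings-local.xml"
```

The key design point is resolver ordering: Ivy consults resolvers in chain order, so placing the local filesystem resolver first means a jar you just published shadows the nightly snapshot of the same revision.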




On 11/29/09 11:05 AM, Vasilis Liaskovitis wrote:
> Hi,
>
> how can I build and use hadoop-git?
> The project has recently been split into 3 repositories hadoop-common,
> hadoop-hdfs and hadoop-mapred. It's not clear to me how to
> build/compile and use the git/tip for the whole framework. E.g. would
> building all jars from the 3 subprojects (and copying them under the
> same directory) work?
> Is there a guide/wiki page out there for this? Or perhaps there is
> another repository which still has a centralized trunk for all
> subprojects?
> thanks in advance,
>
> - Vasilis
>    


Re: Hadoop on Sun Solaris

Posted by Jochen Frey <jo...@scoutlabs.com>.
Rajendra,

Hadoop works fine on Solaris. We have had it in production on Solaris
for a number of months now.

Good luck!

Best,
Jochen

On Nov 30, 2009, at 6:56, "Palikala, Rajendra (CCL)" <RPalikala@carnival.com> wrote:

>
> Can I build Hadoop on Sun Solaris. The documentation says it is only  
> supported on Linux, Open Solaris and on Windows for Dev purposes. I  
> want to build a prototype on Hadoop on our existing OS which is Sun  
> Solaris. This is purely for proto-typing purposes only. As Hadoop is  
> completely written in Java, I think I can install on Hadoop. Please  
> advise.
>
> Thanks,
> Rajendra

Re: Hadoop on Sun Solaris

Posted by Jason Venner <ja...@gmail.com>.
I had to hard-code the OS name in the build.xml file to get the native
compression codec shared libraries to build for Hadoop 0.19.

On Mon, Nov 30, 2009 at 7:09 AM, Daniel Templeton <Da...@sun.com> wrote:

> I'm using it on Solaris without any problem.  Of course, I'm just using the
> provided JAR files.  As long as you have all the right pieces, e.g. ant,
> javac, the libraries, you should be able to build.
>
> Daniel
>
>
> Palikala, Rajendra (CCL) wrote:
>
>> Can I build Hadoop on Sun Solaris. The documentation says it is only
>> supported on Linux, Open Solaris and on Windows for Dev purposes. I want to
>> build a prototype on Hadoop on our existing OS which is Sun Solaris. This is
>> purely for proto-typing purposes only. As Hadoop is completely written in
>> Java, I think I can install on Hadoop. Please advise.
>>
>> Thanks,
>> Rajendra
>>
>
>


-- 
Pro Hadoop, a book to guide you from beginner to hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com a community for Hadoop Professionals

Re: Hadoop on Sun Solaris

Posted by Daniel Templeton <Da...@Sun.COM>.
I'm using it on Solaris without any problem.  Of course, I'm just using 
the provided JAR files.  As long as you have all the right pieces, e.g. 
ant, javac, the libraries, you should be able to build.

Daniel

Palikala, Rajendra (CCL) wrote:
> Can I build Hadoop on Sun Solaris. The documentation says it is only supported on Linux, Open Solaris and on Windows for Dev purposes. I want to build a prototype on Hadoop on our existing OS which is Sun Solaris. This is purely for proto-typing purposes only. As Hadoop is completely written in Java, I think I can install on Hadoop. Please advise.
>
> Thanks,
> Rajendra 
>   
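
Daniel's point above is that a pure-Java deployment from the provided JARs needs little more than a JDK, while a source build additionally needs ant and javac. A quick sketch for checking those pieces on a Solaris (or any POSIX) host:

```shell
# Check for the usual build/run prerequisites. Only `java` is needed to
# run the shipped JARs; `javac` and `ant` matter only for source builds.
for tool in java javac ant; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```

If `java` is found but `ant` is missing, you can still run a binary release; you just cannot rebuild it from source on that machine.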



Re: Hadoop on Sun Solaris

Posted by BM <bo...@gmail.com>.
On Wed, Dec 2, 2009 at 12:36 AM, Edward Capriolo <ed...@gmail.com> wrote:
> What is so confusing about putting Java in the name of every product? :)
...and use JNI. I hear you. :-)

-- 
Kind regards, BM

Things, that are stupid at the beginning, rarely ends up wisely.

Re: Hadoop on Sun Solaris

Posted by Edward Capriolo <ed...@gmail.com>.
On Tue, Dec 1, 2009 at 10:18 AM, BM <bo...@gmail.com> wrote:
> On Mon, Nov 30, 2009 at 11:56 PM, Palikala, Rajendra (CCL)
> <RP...@carnival.com> wrote:
>> Can I build Hadoop on Sun Solaris.
> Yes. Besides, you could try yourself instead (literally 15 minutes to
> make it clear)... :-)
>
>> The documentation says it is only supported on [...] Open Solaris
> Same thing, just newer (SunOS 5.11). Get Java 6 for appropriate
> platform and that's it. It is just like Sun in their "best" traditions
> continue to make a complete mess with their brands and thus
> professionally confuse people as much as possible... I really hate
> that.
>
> --
> Kind regards, BM
>
> Things, that are stupid at the beginning, rarely ends up wisely.
>

What is so confusing about putting Java in the name of every product? :)

Re: Hadoop on Sun Solaris

Posted by BM <bo...@gmail.com>.
On Mon, Nov 30, 2009 at 11:56 PM, Palikala, Rajendra (CCL)
<RP...@carnival.com> wrote:
> Can I build Hadoop on Sun Solaris.
Yes. Besides, you could just try it yourself (literally 15 minutes to
make it clear)... :-)

> The documentation says it is only supported on [...] Open Solaris
Same thing, just newer (SunOS 5.11). Get Java 6 for the appropriate
platform and that's it. It is just like Sun, in their "best" traditions,
to continue making a complete mess of their brands and thus
professionally confusing people as much as possible... I really hate
that.

-- 
Kind regards, BM

Things, that are stupid at the beginning, rarely ends up wisely.

Hadoop on Sun Solaris

Posted by "Palikala, Rajendra (CCL)" <RP...@carnival.com>.
Can I build Hadoop on Sun Solaris? The documentation says it is only supported on Linux, OpenSolaris, and on Windows for dev purposes. I want to build a prototype on Hadoop on our existing OS, which is Sun Solaris. This is purely for prototyping purposes. As Hadoop is written entirely in Java, I think I can install it there. Please advise.

Thanks,
Rajendra 
