Posted to user@hadoop.apache.org by Karim Awara <ka...@kaust.edu.sa> on 2013/10/06 13:41:10 UTC

Hadoop 2.x with Eclipse

Hi,

I followed the instructions on how to import the Hadoop source files into
Eclipse (I am using Hadoop 2.1 beta).

Currently I have Hadoop 2.1 installed on my machine, and its source code is
imported into Eclipse. What I can't grasp is how to proceed from there.

I want to modify HDFS code (the block placement strategy). Right now,
building the hdfs project from Eclipse gives me errors (unresolved types in
hadoop-common). And if I do build successfully, how do I test my modified
code?
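
For reference, the replies below point to the documented command-line Maven
build, which produces a distribution you can install and run with the modified
block placement code. A rough sketch, per BUILDING.txt in the source tree
(exact flags and output location may vary by version):

  # from the top of the source tree
  mvn package -Pdist -DskipTests -Dtar
  # the runnable distribution ends up under hadoop-dist/target/hadoop-<version>/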

--
Best Regards,
Karim Ahmed Awara

-- 

------------------------------
This message and its contents, including attachments are intended solely 
for the original recipient. If you are not the intended recipient or have 
received this message in error, please notify me immediately and delete 
this message from your computer system. Any unauthorized use or 
distribution is prohibited. Please consider the environment before printing 
this email.

Re: Hadoop 2.x with Eclipse

Posted by Karim Awara <ka...@kaust.edu.sa>.
Another problem: when I import the compiled Hadoop source into Eclipse, it
does not resolve certain types, such as "AvroRecord", which is used in the
hadoop-common project and is generated from the schema at
"/hadoop-common-project/hadoop-common/src/test/avro/avroRecord.avsc". Somehow
Eclipse does not know about this generated class, so the Eclipse build of the
unit tests fails with errors.
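
One likely explanation: AvroRecord is generated from that .avsc schema by the
Maven build, so Eclipse can only resolve it after the code generators have run.
A rough sketch of a workaround (the exact Maven phase and output folder may
differ in your checkout):

  # run the code generators for hadoop-common (or do a full
  # 'mvn install -DskipTests' from the top of the tree)
  cd hadoop-common-project/hadoop-common
  mvn generate-test-sources
  # then refresh the project in Eclipse and, if the class is still unresolved,
  # add the generated-sources/generated-test-sources folders under target/ to
  # the project's build path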

--
Best Regards,
Karim Ahmed Awara


On Mon, Nov 4, 2013 at 8:09 AM, Karim Awara <ka...@kaust.edu.sa>wrote:

>
> The thing is.. when I downloaded the source code and compiled it with
> maven. There exist  no configuration files to configure. So I assume maven
> has its own way of test the unit tests... or am I missing something?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
>
> On Mon, Nov 4, 2013 at 7:53 AM, Tsuyoshi OZAWA <oz...@gmail.com>wrote:
>
>> Instead of Ted's approach, it's also useful to use surefire plugin
>> when you debug tests.
>>
>> mvn test -Dmaven.surefire.debug -Dtest=TestClassName
>>
>> This commands accept debugger's attach on 5005 port by default, so you
>> can attach via eclipse's debugger. Then the test runs and you can use
>> debugger. I think the source code is needed to be compiled in your
>> local environment instead of just downloading it from hadoop's
>> release.
>>
>> Thanks,
>> Tsuyoshi
>>
>> On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <ka...@kaust.edu.sa>
>> wrote:
>> > Hi guys,
>> >
>> > Can I just install the HDFS project and debug it? (assuming I am
>> running a
>> > <put> command through the command line). If so, which project should I
>> > download (hadoop project that has hdfs)?
>> >
>> > --
>> > Best Regards,
>> > Karim Ahmed Awara
>> >
>> >
>> > On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:
>> >>
>> >> Karim:
>> >> If you want to debug unit tests, using Eclipse is a viable approach.
>> >> Here is what I did the past week debugging certain part of hadoop
>> >> (JobSubmitter in particular) through an HBase unit test.
>> >>
>> >> Run 'mvn install -DskipTests' to install hadoop locally
>> >> Open the class you want to debug and place breakpoint at proper
>> location
>> >> Open unit test which depends on the class above and select Debug As ->
>> >> JUnit Test
>> >> When breakpoint hits, associate the sources.jar file in local maven
>> repo
>> >> with the class. In my case, the sources jar file is located under
>> >>
>> ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>> >>
>> >> You should be able to step through hadoop code as usual at this point.
>> >>
>> >> Cheers
>> >>
>> >>
>> >> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>
>> >> wrote:
>> >>>
>> >>> Karim,
>> >>>
>> >>>
>> >>>
>> >>> I am not an experienced Hadoop programmer, but what I found was that
>> >>> building and debugging Hadoop under Eclipse was very difficult, and I
>> was
>> >>> never able to make it work correctly.  I suggest using the well documented
>> >>> command-line Maven build, installing Hadoop from that build, and
>> running it
>> >>> normally.  Once you have that working, run your namenode or datanode
>> daemon
>> >>> so as to wait for a remote debugger attach before starting.  You
>> should also
>> >>> get comfortable with log4j, the logging framework used by Hadoop, as
>> those
>> >>> log files are often your best friend when trying to debug a
>> collection of
>> >>> services.
>> >>>
>> >>>
>> >>>
>> >>> john
>> >>>
>> >>>
>> >>>
>> >>> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
>> >>> Sent: Sunday, October 06, 2013 5:41 AM
>> >>> To: user
>> >>> Subject: Hadoop 2.x with Eclipse
>> >>>
>> >>>
>> >>>
>> >>> Hi,
>> >>>
>> >>> I followed the instructions on how to import hadoop files to Eclipse
>> (I
>> >>> am using hadoop 2.1 beta).
>> >>>
>> >>> Currently on my machine, I have hadoop 2.1 installed.. and its source
>> >>> code is imported on Eclipse. What I can't grasp is   how to proceed
>> from
>> >>> there?
>> >>>
>> >>> I want to modify HDFS code (blockplacement strategy).. Now building
>> hdfs
>> >>> project via generates errors to me (unresolved types in hadoop
>> common).  and
>> >>> if i built successfully, how to test my modified code?
>> >>>
>> >>>
>> >>> --
>> >>> Best Regards,
>> >>> Karim Ahmed Awara
>> >>>
>> >>>
>> >>>
>> >>> ________________________________
>> >>>
>> >>> This message and its contents, including attachments are intended
>> solely
>> >>> for the original recipient. If you are not the intended recipient or
>> have
>> >>> received this message in error, please notify me immediately and
>> delete this
>> >>> message from your computer system. Any unauthorized use or
>> distribution is
>> >>> prohibited. Please consider the environment before printing this
>> email.
>> >>
>> >>
>> >
>> >
>> > ________________________________
>> > This message and its contents, including attachments are intended
>> solely for
>> > the original recipient. If you are not the intended recipient or have
>> > received this message in error, please notify me immediately and delete
>> this
>> > message from your computer system. Any unauthorized use or distribution
>> is
>> > prohibited. Please consider the environment before printing this email.
>>
>>
>>
>> --
>> - Tsuyoshi
>>
>
>


Re: Hadoop 2.x with Eclipse

Posted by Karim Awara <ka...@kaust.edu.sa>.
The thing is, when I downloaded the source code and compiled it with Maven,
there were no configuration files for me to configure. So I assume Maven has
its own way of running the unit tests... or am I missing something?
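
For context, the unit tests are not driven by separate configuration files; the
Surefire plugin configured in each module's POM discovers and runs them. A
small sketch of the usual invocations (the test class named here is only an
example, pick one that covers the code you changed):

  # run every unit test in the current module
  mvn test
  # run a single test class in the HDFS module
  cd hadoop-hdfs-project/hadoop-hdfs
  mvn test -Dtest=TestReplicationPolicy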

--
Best Regards,
Karim Ahmed Awara


On Mon, Nov 4, 2013 at 7:53 AM, Tsuyoshi OZAWA <oz...@gmail.com>wrote:

> Instead of Ted's approach, it's also useful to use surefire plugin
> when you debug tests.
>
> mvn test -Dmaven.surefire.debug -Dtest=TestClassName
>
> This commands accept debugger's attach on 5005 port by default, so you
> can attach via eclipse's debugger. Then the test runs and you can use
> debugger. I think the source code is needed to be compiled in your
> local environment instead of just downloading it from hadoop's
> release.
>
> Thanks,
> Tsuyoshi
>
> On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <ka...@kaust.edu.sa>
> wrote:
> > Hi guys,
> >
> > Can I just install the HDFS project and debug it? (assuming I am running
> a
> > <put> command through the command line). If so, which project should I
> > download (hadoop project that has hdfs)?
> >
> > --
> > Best Regards,
> > Karim Ahmed Awara
> >
> >
> > On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:
> >>
> >> Karim:
> >> If you want to debug unit tests, using Eclipse is a viable approach.
> >> Here is what I did the past week debugging certain part of hadoop
> >> (JobSubmitter in particular) through an HBase unit test.
> >>
> >> Run 'mvn install -DskipTests' to install hadoop locally
> >> Open the class you want to debug and place breakpoint at proper location
> >> Open unit test which depends on the class above and select Debug As ->
> >> JUnit Test
> >> When breakpoint hits, associate the sources.jar file in local maven repo
> >> with the class. In my case, the sources jar file is located under
> >>
> ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
> >>
> >> You should be able to step through hadoop code as usual at this point.
> >>
> >> Cheers
> >>
> >>
> >> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>
> >> wrote:
> >>>
> >>> Karim,
> >>>
> >>>
> >>>
> >>> I am not an experienced Hadoop programmer, but what I found was that
> >>> building and debugging Hadoop under Eclipse was very difficult, and I
> was
> >>> never able to make it work correctly.  I suggest using the well documented
> >>> command-line Maven build, installing Hadoop from that build, and
> running it
> >>> normally.  Once you have that working, run your namenode or datanode
> daemon
> >>> so as to wait for a remote debugger attach before starting.  You
> should also
> >>> get comfortable with log4j, the logging framework used by Hadoop, as
> those
> >>> log files are often your best friend when trying to debug a collection
> of
> >>> services.
> >>>
> >>>
> >>>
> >>> john
> >>>
> >>>
> >>>
> >>> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
> >>> Sent: Sunday, October 06, 2013 5:41 AM
> >>> To: user
> >>> Subject: Hadoop 2.x with Eclipse
> >>>
> >>>
> >>>
> >>> Hi,
> >>>
> >>> I followed the instructions on how to import hadoop files to Eclipse (I
> >>> am using hadoop 2.1 beta).
> >>>
> >>> Currently on my machine, I have hadoop 2.1 installed.. and its source
> >>> code is imported on Eclipse. What I can't grasp is   how to proceed
> from
> >>> there?
> >>>
> >>> I want to modify HDFS code (blockplacement strategy).. Now building
> hdfs
> >>> project via generates errors to me (unresolved types in hadoop
> common).  and
> >>> if i built successfully, how to test my modified code?
> >>>
> >>>
> >>> --
> >>> Best Regards,
> >>> Karim Ahmed Awara
> >>>
> >>>
> >>>
> >>> ________________________________
> >>>
> >>> This message and its contents, including attachments are intended
> solely
> >>> for the original recipient. If you are not the intended recipient or
> have
> >>> received this message in error, please notify me immediately and
> delete this
> >>> message from your computer system. Any unauthorized use or
> distribution is
> >>> prohibited. Please consider the environment before printing this email.
> >>
> >>
> >
> >
> > ________________________________
> > This message and its contents, including attachments are intended solely
> for
> > the original recipient. If you are not the intended recipient or have
> > received this message in error, please notify me immediately and delete
> this
> > message from your computer system. Any unauthorized use or distribution
> is
> > prohibited. Please consider the environment before printing this email.
>
>
>
> --
> - Tsuyoshi
>


Re: Hadoop 2.x with Eclipse

Posted by Tsuyoshi OZAWA <oz...@gmail.com>.
As an alternative to Ted's approach, it's also useful to use the Surefire
plugin when you debug tests:

mvn test -Dmaven.surefire.debug -Dtest=TestClassName

This command accepts a debugger attach on port 5005 by default, so you can
attach Eclipse's remote debugger. The test then runs under the debugger and
you can step through it. Note that the source code needs to be compiled in
your local environment rather than just downloaded from a Hadoop release.
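
For instance, to stop inside an HDFS block placement test and step through it
from Eclipse (the test class named here is only an example, and 5005 is
Surefire's default debug port):

  cd hadoop-hdfs-project/hadoop-hdfs
  # the forked test JVM suspends and waits for a debugger on port 5005
  mvn test -Dmaven.surefire.debug -Dtest=TestReplicationPolicy
  # in Eclipse: Run -> Debug Configurations... -> Remote Java Application,
  # host localhost, port 5005, then Debug to attach and hit your breakpoints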

Thanks,
Tsuyoshi

On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <ka...@kaust.edu.sa> wrote:
> Hi guys,
>
> Can I just install the HDFS project and debug it? (assuming I am running a
> <put> command through the command line). If so, which project should I
> download (hadoop project that has hdfs)?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
>
> On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:
>>
>> Karim:
>> If you want to debug unit tests, using Eclipse is a viable approach.
>> Here is what I did the past week debugging certain part of hadoop
>> (JobSubmitter in particular) through an HBase unit test.
>>
>> Run 'mvn install -DskipTests' to install hadoop locally
>> Open the class you want to debug and place breakpoint at proper location
>> Open unit test which depends on the class above and select Debug As ->
>> JUnit Test
>> When breakpoint hits, associate the sources.jar file in local maven repo
>> with the class. In my case, the sources jar file is located under
>> ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>>
>> You should be able to step through hadoop code as usual at this point.
>>
>> Cheers
>>
>>
>> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>
>> wrote:
>>>
>>> Karim,
>>>
>>>
>>>
>>> I am not an experienced Hadoop programmer, but what I found was that
>>> building and debugging Hadoop under Eclipse was very difficult, and I was
>>> never to make it work correctly.  I suggest using the well documented
>>> command-line Maven build, installing Hadoop from that build, and running it
>>> normally.  Once you have that working, run your namenode or datanode daemon
>>> so as to wait for a remote debugger attach before starting.  You should also
>>> get comfortable with log4j, the logging framework used by Hadoop, as those
>>> log files are often your best friend when trying to debug a collection of
>>> services.
>>>
>>>
>>>
>>> john
>>>
>>>
>>>
>>> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
>>> Sent: Sunday, October 06, 2013 5:41 AM
>>> To: user
>>> Subject: Hadoop 2.x with Eclipse
>>>
>>>
>>>
>>> Hi,
>>>
>>> I followed the instructions on how to import hadoop files to Eclipse (I
>>> am using hadoop 2.1 beta).
>>>
>>> Currently on my machine, I have hadoop 2.1 installed.. and its source
>>> code is imported on Eclipse. What I can't grasp is   how to proceed from
>>> there?
>>>
>>> I want to modify HDFS code (blockplacement strategy).. Now building hdfs
>>> project via generates errors to me (unresolved types in hadoop common).  and
>>> if i built successfully, how to test my modified code?
>>>
>>>
>>> --
>>> Best Regards,
>>> Karim Ahmed Awara
>>>
>>>
>>>
>>> ________________________________
>>>
>>> This message and its contents, including attachments are intended solely
>>> for the original recipient. If you are not the intended recipient or have
>>> received this message in error, please notify me immediately and delete this
>>> message from your computer system. Any unauthorized use or distribution is
>>> prohibited. Please consider the environment before printing this email.
>>
>>
>
>
> ________________________________
> This message and its contents, including attachments are intended solely for
> the original recipient. If you are not the intended recipient or have
> received this message in error, please notify me immediately and delete this
> message from your computer system. Any unauthorized use or distribution is
> prohibited. Please consider the environment before printing this email.



-- 
- Tsuyoshi
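
To follow John's suggestion above of having a daemon wait for a remote debugger
before it starts, the standard JVM debug agent can be passed in through the
daemon options in hadoop-env.sh. A sketch, assuming the usual
HADOOP_NAMENODE_OPTS hook and an arbitrary port:

  # etc/hadoop/hadoop-env.sh
  export HADOOP_NAMENODE_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000 $HADOOP_NAMENODE_OPTS"
  # start the NameNode; it blocks until a remote debugger attaches on port 8000
  sbin/hadoop-daemon.sh start namenode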

Re: Hadoop 2.x with Eclipse

Posted by Tsuyoshi OZAWA <oz...@gmail.com>.
Instead of Ted's approach, it's also useful to use surefire plugin
when you debug tests.

mvn test -Dmaven.surefire.debug -Dtest=TestClassName

This commands accept debugger's attach on 5005 port by default, so you
can attach via eclipse's debugger. Then the test runs and you can use
debugger. I think the source code is needed to be compiled in your
local environment instead of just downloading it from hadoop's
release.

Thanks,
Tsuyoshi

On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <ka...@kaust.edu.sa> wrote:
> Hi guys,
>
> Can I just install the HDFS project and debug it? (assuming I am running a
> <put> command through the command line). If so, which project should I
> download (hadoop project that has hdfs)?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
>
> On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:
>>
>> Karim:
>> If you want to debug unit tests, using Eclipse is a viable approach.
>> Here is what I did the past week debugging certain part of hadoop
>> (JobSubmitter in particular) through an HBase unit test.
>>
>> Run 'mvn install -DskipTests' to install hadoop locally
>> Open the class you want to debug and place breakpoint at proper location
>> Open unit test which depends on the class above and select Debug As ->
>> JUnit Test
>> When breakpoint hits, associate the sources.jar file in local maven repo
>> with the class. In my case, the sources jar file is located under
>> ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>>
>> You should be able to step through hadoop code as usual at this point.
>>
>> Cheers
>>
>>
>> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>
>> wrote:
>>>
>>> Karim,
>>>
>>>
>>>
>>> I am not an experienced Hadoop programmer, but what I found was that
>>> building and debugging Hadoop under Eclipse was very difficult, and I was
>>> never to make it work correctly.  I suggest using the well documented
>>> command-line Maven build, installing Hadoop from that build, and running it
>>> normally.  Once you have that working, run your namemode or datanode daemon
>>> so as to wait for a remote debugger attach before starting.  You should also
>>> get comfortable with log4j, the logging framework used by Hadoop, as those
>>> log files are often your best friend when trying to debug a collection of
>>> services.
>>>
>>>
>>>
>>> john
>>>
>>>
>>>
>>> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
>>> Sent: Sunday, October 06, 2013 5:41 AM
>>> To: user
>>> Subject: Hadoop 2.x with Eclipse
>>>
>>>
>>>
>>> Hi,
>>>
>>> I followed the instructions on how to import hadoop files to Eclipse (I
>>> am using hadoop 2.1 beta).
>>>
>>> Currently on my machine, I have hadoop 2.1 installed.. and its source
>>> code is imported on Eclipse. What I can't grasp is   how to proceed from
>>> there?
>>>
>>> I want to modify HDFS code (blockplacement strategy).. Now building hdfs
>>> project via generates errors to me (unresolved types in hadoop common).  and
>>> if i built successfully, how to test my modified code?
>>>
>>>
>>> --
>>> Best Regards,
>>> Karim Ahmed Awara
>>>
>>>
>>>
>>> ________________________________
>>>
>>> This message and its contents, including attachments are intended solely
>>> for the original recipient. If you are not the intended recipient or have
>>> received this message in error, please notify me immediately and delete this
>>> message from your computer system. Any unauthorized use or distribution is
>>> prohibited. Please consider the environment before printing this email.
>>
>>
>
>
> ________________________________
> This message and its contents, including attachments are intended solely for
> the original recipient. If you are not the intended recipient or have
> received this message in error, please notify me immediately and delete this
> message from your computer system. Any unauthorized use or distribution is
> prohibited. Please consider the environment before printing this email.



-- 
- Tsuyoshi

Re: Hadoop 2.x with Eclipse

Posted by Tsuyoshi OZAWA <oz...@gmail.com>.
Instead of Ted's approach, it's also useful to use surefire plugin
when you debug tests.

mvn test -Dmaven.surefire.debug -Dtest=TestClassName

This commands accept debugger's attach on 5005 port by default, so you
can attach via eclipse's debugger. Then the test runs and you can use
debugger. I think the source code is needed to be compiled in your
local environment instead of just downloading it from hadoop's
release.

Thanks,
Tsuyoshi

On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <ka...@kaust.edu.sa> wrote:
> Hi guys,
>
> Can I just install the HDFS project and debug it? (assuming I am running a
> <put> command through the command line). If so, which project should I
> download (hadoop project that has hdfs)?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
>
> On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:
>>
>> Karim:
>> If you want to debug unit tests, using Eclipse is a viable approach.
>> Here is what I did the past week debugging certain part of hadoop
>> (JobSubmitter in particular) through an HBase unit test.
>>
>> Run 'mvn install -DskipTests' to install hadoop locally
>> Open the class you want to debug and place breakpoint at proper location
>> Open unit test which depends on the class above and select Debug As ->
>> JUnit Test
>> When breakpoint hits, associate the sources.jar file in local maven repo
>> with the class. In my case, the sources jar file is located under
>> ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>>
>> You should be able to step through hadoop code as usual at this point.
>>
>> Cheers
>>
>>
>> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>
>> wrote:
>>>
>>> Karim,
>>>
>>>
>>>
>>> I am not an experienced Hadoop programmer, but what I found was that
>>> building and debugging Hadoop under Eclipse was very difficult, and I was
>>> never to make it work correctly.  I suggest using the well documented
>>> command-line Maven build, installing Hadoop from that build, and running it
>>> normally.  Once you have that working, run your namemode or datanode daemon
>>> so as to wait for a remote debugger attach before starting.  You should also
>>> get comfortable with log4j, the logging framework used by Hadoop, as those
>>> log files are often your best friend when trying to debug a collection of
>>> services.
>>>
>>>
>>>
>>> john
>>>
>>>
>>>
>>> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
>>> Sent: Sunday, October 06, 2013 5:41 AM
>>> To: user
>>> Subject: Hadoop 2.x with Eclipse
>>>
>>>
>>>
>>> Hi,
>>>
>>> I followed the instructions on how to import hadoop files to Eclipse (I
>>> am using hadoop 2.1 beta).
>>>
>>> Currently on my machine, I have hadoop 2.1 installed.. and its source
>>> code is imported on Eclipse. What I can't grasp is   how to proceed from
>>> there?
>>>
>>> I want to modify HDFS code (blockplacement strategy).. Now building hdfs
>>> project via generates errors to me (unresolved types in hadoop common).  and
>>> if i built successfully, how to test my modified code?
>>>
>>>
>>> --
>>> Best Regards,
>>> Karim Ahmed Awara
>>>
>>>
>>>
>>> ________________________________
>>>
>>> This message and its contents, including attachments are intended solely
>>> for the original recipient. If you are not the intended recipient or have
>>> received this message in error, please notify me immediately and delete this
>>> message from your computer system. Any unauthorized use or distribution is
>>> prohibited. Please consider the environment before printing this email.
>>
>>
>
>
> ________________________________
> This message and its contents, including attachments are intended solely for
> the original recipient. If you are not the intended recipient or have
> received this message in error, please notify me immediately and delete this
> message from your computer system. Any unauthorized use or distribution is
> prohibited. Please consider the environment before printing this email.



-- 
- Tsuyoshi

Re: Hadoop 2.x with Eclipse

Posted by Karim Awara <ka...@kaust.edu.sa>.
Hi guys,

Can I just install the HDFS project and debug it (assuming I am running a
<put> command through the command line)? If so, which project should I
download (the hadoop project that contains hdfs)?

--
Best Regards,
Karim Ahmed Awara


On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yu...@gmail.com> wrote:

> Karim:
> If you want to debug unit tests, using Eclipse is a viable approach.
> Here is what I did the past week debugging certain part of hadoop
> (JobSubmitter in particular) through an HBase unit test.
>
> Run 'mvn install -DskipTests' to install hadoop locally
> Open the class you want to debug and place breakpoint at proper location
> Open unit test which depends on the class above and select Debug As ->
> JUnit Test
> When breakpoint hits, associate the sources.jar file in local maven repo
> with the class. In my case, the sources jar file is located
> under ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>
> You should be able to step through hadoop code as usual at this point.
>
> Cheers
>
>
> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>wrote:
>
>>  Karim,
>>
>>
>>
>> I am not an experienced Hadoop programmer, but what I found was that
>> building and debugging Hadoop under Eclipse was very difficult, and I was
>> never able to make it work correctly.  I suggest using the well-documented
>> command-line Maven build, installing Hadoop from that build, and running it
>> normally.  Once you have that working, run your namenode or datanode daemon
>> so as to wait for a remote debugger attach before starting.  You should
>> also get comfortable with log4j, the logging framework used by Hadoop, as
>> those log files are often your best friend when trying to debug a
>> collection of services.
>>
>>
>>
>> john
>>
>>
>>
>> *From:* Karim Awara [mailto:karim.awara@kaust.edu.sa]
>> *Sent:* Sunday, October 06, 2013 5:41 AM
>> *To:* user
>> *Subject:* Hadoop 2.x with Eclipse
>>
>>
>>
>> Hi,
>>
>> I followed the instructions on how to import hadoop files to Eclipse (I
>> am using hadoop 2.1 beta).
>>
>> Currently on my machine, I have hadoop 2.1 installed, and its source
>> code is imported into Eclipse. What I can't grasp is how to proceed from
>> there?
>>
>> I want to modify the HDFS code (block placement strategy). Building the
>> hdfs project now generates errors for me (unresolved types in
>> hadoop-common), and if I build it successfully, how do I test my
>> modified code?
>>
>>
>>   --
>> Best Regards,
>> Karim Ahmed Awara
>>
>>
>>  ------------------------------
>>
>> This message and its contents, including attachments are intended solely
>> for the original recipient. If you are not the intended recipient or have
>> received this message in error, please notify me immediately and delete
>> this message from your computer system. Any unauthorized use or
>> distribution is prohibited. Please consider the environment before printing
>> this email.
>>
>
>

-- 

------------------------------
This message and its contents, including attachments are intended solely 
for the original recipient. If you are not the intended recipient or have 
received this message in error, please notify me immediately and delete 
this message from your computer system. Any unauthorized use or 
distribution is prohibited. Please consider the environment before printing 
this email.

Re: Hadoop 2.x with Eclipse

Posted by Ted Yu <yu...@gmail.com>.
Karim:
If you want to debug unit tests, using Eclipse is a viable approach.
Here is what I did the past week debugging certain part of hadoop
(JobSubmitter in particular) through an HBase unit test.

Run 'mvn install -DskipTests' to install hadoop locally
Open the class you want to debug and place breakpoint at proper location
Open unit test which depends on the class above and select Debug As ->
JUnit Test
When breakpoint hits, associate the sources.jar file in local maven repo
with the class. In my case, the sources jar file is located
under ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT

You should be able to step through hadoop code as usual at this point.
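
As a rough sketch of the command-line side of those steps (the eclipse:eclipse
goal and its flags are taken from the stock maven-eclipse-plugin and are an
assumption to verify against your checkout, not something the steps above
require):

# from the root of the hadoop source tree
mvn install -DskipTests
# optionally generate Eclipse project metadata with sources attached, which
# usually makes the manual sources.jar association unnecessary
mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true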

Cheers


On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <jo...@redpoint.net>wrote:

>  Karim,
>
> I am not an experienced Hadoop programmer, but what I found was that
> building and debugging Hadoop under Eclipse was very difficult, and I was
> never able to make it work correctly.  I suggest using the well-documented
> command-line Maven build, installing Hadoop from that build, and running it
> normally.  Once you have that working, run your namenode or datanode daemon
> so as to wait for a remote debugger attach before starting.  You should
> also get comfortable with log4j, the logging framework used by Hadoop, as
> those log files are often your best friend when trying to debug a
> collection of services.
>
> john
>
> From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
> Sent: Sunday, October 06, 2013 5:41 AM
> To: user
> Subject: Hadoop 2.x with Eclipse
>
> Hi,
>
> I followed the instructions on how to import hadoop files to Eclipse (I am
> using hadoop 2.1 beta).
>
> Currently on my machine, I have hadoop 2.1 installed, and its source code
> is imported into Eclipse. What I can't grasp is how to proceed from there?
>
> I want to modify the HDFS code (block placement strategy). Building the
> hdfs project now generates errors for me (unresolved types in
> hadoop-common), and if I build it successfully, how do I test my modified
> code?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
> ------------------------------
>
> This message and its contents, including attachments are intended solely
> for the original recipient. If you are not the intended recipient or have
> received this message in error, please notify me immediately and delete
> this message from your computer system. Any unauthorized use or
> distribution is prohibited. Please consider the environment before printing
> this email.
>

RE: Hadoop 2.x with Eclipse

Posted by John Lilley <jo...@redpoint.net>.
Karim,

I am not an experienced Hadoop programmer, but what I found was that building and debugging Hadoop under Eclipse was very difficult, and I was never able to make it work correctly.  I suggest using the well-documented command-line Maven build, installing Hadoop from that build, and running it normally.  Once you have that working, run your namenode or datanode daemon so as to wait for a remote debugger attach before starting.  You should also get comfortable with log4j, the logging framework used by Hadoop, as those log files are often your best friend when trying to debug a collection of services.
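
A minimal sketch of that wait-for-debugger setup (the variable names, port,
paths, and log4j key below are assumptions to check against your own
hadoop-env.sh and log4j.properties):

# etc/hadoop/hadoop-env.sh: make the NameNode JVM suspend until a debugger attaches on port 8000
export HADOOP_NAMENODE_OPTS="$HADOOP_NAMENODE_OPTS -agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000"

# start the daemon; it now blocks until an Eclipse "Remote Java Application"
# debug configuration attaches to localhost:8000
sbin/hadoop-daemon.sh start namenode

# the same idea works for a client-side command such as 'hdfs dfs -put',
# by putting the -agentlib:jdwp option into HADOOP_CLIENT_OPTS instead

# log4j.properties: raise HDFS logging while debugging, e.g.
#   log4j.logger.org.apache.hadoop.hdfs=DEBUG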

john

From: Karim Awara [mailto:karim.awara@kaust.edu.sa]
Sent: Sunday, October 06, 2013 5:41 AM
To: user
Subject: Hadoop 2.x with Eclipse

Hi,
I followed the instructions on how to import hadoop files to Eclipse (I am using hadoop 2.1 beta).
Currently on my machine, I have hadoop 2.1 installed, and its source code is imported into Eclipse. What I can't grasp is how to proceed from there?
I want to modify the HDFS code (block placement strategy). Building the hdfs project now generates errors for me (unresolved types in hadoop-common), and if I build it successfully, how do I test my modified code?

--
Best Regards,
Karim Ahmed Awara

________________________________
This message and its contents, including attachments are intended solely for the original recipient. If you are not the intended recipient or have received this message in error, please notify me immediately and delete this message from your computer system. Any unauthorized use or distribution is prohibited. Please consider the environment before printing this email.
