Posted to hdfs-dev@hadoop.apache.org by Uma Maheswara Rao G <ha...@gmail.com> on 2013/07/07 17:28:11 UTC

mvn eclipse:eclipse failure on windows

Hi,

I am seeing this failure on Windows while executing the mvn eclipse:eclipse
command on trunk.

See the following trace:

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-eclipse-plugin:2.8:eclipse (default-cli) on project hadoop-common: Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[], exclude=[common-version-info.properties|**/*.java], test=false, filtering=false, merging with=resource src/main/resources: output=target/classes, include=[common-version-info.properties], exclude=[**/*.java], test=false, filtering=true -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
E:\Hadoop-Trunk>

Any ideas for resolving it?

With 'org.apache.maven.plugins:maven-eclipse-plugin:2.6:eclipse' there seem
to be no failures, but I am seeing the following exception while running tests:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
    at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:696)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:530)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:401)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:435)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:607)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:592)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1172)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
    at org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode.setupCluster(TestHASafeMode.java:87)

Not sure what I missed. Any idea what could be wrong here?

Regards,
Uma

Re: mvn eclipse:eclipse failure on windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
At this point, running on Windows does require hadoop.dll in the library
path and winutils.exe accessible through HADOOP_HOME.  This differs from
Linux, where libhadoop.so is a soft dependency, and you can choose to run
without it.  (There are just too many things broken in subtle ways if we
don't interact properly with the native Windows APIs.)
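
For example, before launching anything that needs the native layer from a
Command Prompt, something like this works (I use C:\hdc as the checkout root,
so adjust the paths to your tree):

    set HADOOP_HOME=C:\hdc\hadoop-common-project\hadoop-common\target
    set PATH=%HADOOP_HOME%\bin;%PATH%

After a full build of hadoop-common, hadoop.dll and winutils.exe should both
land in that target\bin directory.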

I wonder if there is anything we can configure in our pom.xml files so that
the library path and HADOOP_HOME requirements get injected into the Eclipse
project files automatically when you run mvn eclipse:eclipse.  This would
make it easier for Windows users to import the project correctly.  I don't
use Eclipse anymore, so I'm not deeply familiar with the options, and I
didn't find anything helpful from my quick research.

BTW, the HADOOP_HOME requirement will change at some point.  HADOOP-9422
tracks removal of this dependency, so you might want to consider setting a
watch on that issue.

https://issues.apache.org/jira/browse/HADOOP-9422

Chris Nauroth
Hortonworks
http://hortonworks.com/



RE: mvn eclipse:eclipse failure on windows

Posted by Uma Maheswara Rao G <ma...@huawei.com>.
Hi Chris,

mvn test works fine for me via the command line, but I am trying with Eclipse.

Looks like two things are missing in Eclipse: 1. /hadoop-common/target/bin is
not on the library path (I just checked; NativeCodeLoader is not finding
hadoop.dll), and 2. HADOOP_HOME is not set.
After setting these two things, everything started working for me. I am not
sure if I missed something that would make this automatic. Before the Windows
merge I did not have such dependencies.
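
For reference, the two settings in my Eclipse run configuration ended up
roughly like this (paths from my checkout at E:\Hadoop-Trunk):

    VM arguments:          -Djava.library.path=E:\Hadoop-Trunk\hadoop-common-project\hadoop-common\target\bin
    Environment variable:  HADOOP_HOME=E:\Hadoop-Trunk\hadoop-common-project\hadoop-common\target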

Regards,
Uma 

Re: mvn eclipse:eclipse failure on windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Loading hadoop.dll in tests is supposed to work via a common shared
maven-surefire-plugin configuration that sets the PATH environment variable
to include the build location of the dll:

https://github.com/apache/hadoop-common/blob/trunk/hadoop-project/pom.xml#L894

(On Windows, the shared library path is controlled with PATH, rather than
LD_LIBRARY_PATH as on Linux.)
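
Stripped down, that shared configuration amounts to something like the
following (simplified; see the linked pom.xml for the exact property names):

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <configuration>
        <environmentVariables>
          <!-- Prepend the hadoop-common native build output, where
               hadoop.dll lands, so forked test JVMs can load it. -->
          <PATH>${env.PATH};${hadoop.common.build.dir}/bin</PATH>
        </environmentVariables>
      </configuration>
    </plugin>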

This configuration has been working fine in all of the dev environments
I've seen, but I'm wondering if something different is happening in your
environment.  Does your hadoop.dll show up in
hadoop-common-project/hadoop-common/target/bin?  Is there anything else
that looks unique in your environment?

Also, another potential gotcha is the Windows max path length limitation of
260 characters.  Deeply nested project structures like Hadoop can cause
very long paths for the built artifacts, and you might not be able to load
the files if the full path exceeds 260 characters.  The workaround for now
is to keep the codebase in a very short root folder.  (I use C:\hdc .)
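
For example, cloning straight into a short root keeps the generated artifact
paths well under the limit:

    git clone https://github.com/apache/hadoop-common.git C:\hdc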

Chris Nauroth
Hortonworks
http://hortonworks.com/



RE: mvn eclipse:eclipse failure on windows

Posted by Chuan Liu <ch...@microsoft.com>.
Hi Uma,

I suggest you do a 'mvn install -DskipTests' before running 'mvn eclipse:eclipse'.
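
That is, from the source root:

    mvn install -DskipTests
    mvn eclipse:eclipse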

Thanks,
Chuan

-----Original Message-----
From: Uma Maheswara Rao G [mailto:hadoop.uma@gmail.com] 
Sent: Friday, July 12, 2013 7:42 PM
To: common-dev@hadoop.apache.org
Cc: hdfs-dev@hadoop.apache.org
Subject: Re: mvn eclipse:eclipse failure on windows

Hi Chris,
  eclipse:eclipse works, but I am still seeing the UnsatisfiedLinkError.
I explicitly pointed java.library.path to where hadoop.dll is generated; the
dll was generated by my clean install build. My PC is 64-bit, and I also set
Platform=x64 while building. But it does not help.

Regards,
Uma






On Fri, Jul 12, 2013 at 11:45 PM, Chris Nauroth <cn...@hortonworks.com>wrote:

> Hi Uma,
>
> I just tried getting a fresh copy of trunk and running "mvn clean 
> install -DskipTests" followed by "mvn eclipse:eclipse -DskipTests".  
> Everything worked fine in my environment.  Are you still seeing the problem?
>
> The UnsatisfiedLinkError seems to indicate that your build couldn't 
> access hadoop.dll for JNI method implementations.  hadoop.dll gets 
> built as part of the hadoop-common sub-module.  Is it possible that 
> you didn't have a complete package build for that sub-module before 
> you started running the HDFS test?
>
> Chris Nauroth
> Hortonworks
> http://hortonworks.com/
>
>
>
> On Sun, Jul 7, 2013 at 9:08 AM, sure bhands <su...@gmail.com> wrote:
>
> > I would try cleaning the hadoop-maven-plugin directory from the maven
> > repository to rule out a stale version, and then mvn install followed by
> > mvn eclipse:eclipse before digging into it further.
> >
> > Thanks,
> > Surendra
> >
> >
> > On Sun, Jul 7, 2013 at 8:28 AM, Uma Maheswara Rao G <
> hadoop.uma@gmail.com
> > >wrote:
> >
> > > Hi,
> > >
> > > I am seeing this failure on windows while executing mvn 
> > > eclipse:eclipse command on trunk.
> > >
> > > See the following trace:
> > >
> > > [INFO]
> > > ------------------------------------------------------------------------
> > > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-eclipse-plugin:2.8:eclipse (default-cli) on project hadoop-common: Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[], exclude=[common-version-info.properties|**/*.java], test=false, filtering=false, merging with=resource src/main/resources: output=target/classes, include=[common-version-info.properties], exclude=[**/*.java], test=false, filtering=true -> [Help 1]
> > > [ERROR]
> > > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > > [ERROR]
> > > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > > [ERROR]
> > > [ERROR] After correcting the problems, you can resume the build with the command
> > >
> > > [ERROR]   mvn <goals> -rf :hadoop-common
> > > E:\Hadoop-Trunk>
> > >
> > > any idea for resolving it.
> > >
> > > With 'org.apache.maven.plugins:maven-eclipse-plugin:2.6:eclipse' seems to be no failures but I am seeing following exception while running tests.
> > > java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
> > >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
> > >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
> > >     at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
> > >     at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:696)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:530)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:401)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:435)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:607)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:592)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1172)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
> > >     at org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode.setupCluster(TestHASafeMode.java:87)
> > >
> > > Not sure what I missed here. Any idea what could be wrong here?
> > >
> > > Regards,
> > > Uma
> > >
> >
>


Re: mvn eclipse:eclipse failure on windows

Posted by Uma Maheswara Rao G <ha...@gmail.com>.
Hi Chris,
  eclipse:eclipse works, but I am still seeing the UnsatisfiedLinkError.
I explicitly pointed java.library.path to where hadoop.dll was generated; the
dll was indeed generated by my clean install command. My PC is 64-bit, and I
also set Platform=x64 while building. But that does not help.
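
For reference, I am launching the test roughly like this (a sketch; the dll directory below is from my local build tree, so treat the exact path as an assumption):

    mvn test -Dtest=TestHASafeMode ^
      -DargLine="-Djava.library.path=E:\Hadoop-Trunk\hadoop-common-project\hadoop-common\target\bin"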

Regards,
Uma






On Fri, Jul 12, 2013 at 11:45 PM, Chris Nauroth <cn...@hortonworks.com>wrote:

> Hi Uma,
>
> I just tried getting a fresh copy of trunk and running "mvn clean install
> -DskipTests" followed by "mvn eclipse:eclipse -DskipTests".  Everything
> worked fine in my environment.  Are you still seeing the problem?
>
> The UnsatisfiedLinkError seems to indicate that your build couldn't access
> hadoop.dll for JNI method implementations.  hadoop.dll gets built as part
> of the hadoop-common sub-module.  Is it possible that you didn't have a
> complete package build for that sub-module before you started running the
> HDFS test?
>
> Chris Nauroth
> Hortonworks
> http://hortonworks.com/
>
>
>
> On Sun, Jul 7, 2013 at 9:08 AM, sure bhands <su...@gmail.com> wrote:
>
> > I would try cleaning hadoop-maven-plugin directory from maven repository
> to
> > rule out the stale version and then mv install followed by mvn
> > eclipse:eclipse before digging in to it further.
> >
> > Thanks,
> > Surendra
> >
> >
> > On Sun, Jul 7, 2013 at 8:28 AM, Uma Maheswara Rao G <
> hadoop.uma@gmail.com
> > >wrote:
> >
> > > Hi,
> > >
> > > I am seeing this failure on windows while executing mvn eclipse:eclipse
> > > command on trunk.
> > >
> > > See the following trace:
> > >
> > > [INFO]
> > > ------------------------------------------------------------------------
> > > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-eclipse-plugin:2.8:eclipse (default-cli) on project hadoop-common: Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[], exclude=[common-version-info.properties|**/*.java], test=false, filtering=false, merging with=resource src/main/resources: output=target/classes, include=[common-version-info.properties], exclude=[**/*.java], test=false, filtering=true -> [Help 1]
> > > [ERROR]
> > > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > > [ERROR]
> > > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > > [ERROR]
> > > [ERROR] After correcting the problems, you can resume the build with the command
> > >
> > > [ERROR]   mvn <goals> -rf :hadoop-common
> > > E:\Hadoop-Trunk>
> > >
> > > any idea for resolving it.
> > >
> > > With 'org.apache.maven.plugins:maven-eclipse-plugin:2.6:eclipse' seems to be no failures but I am seeing following exception while running tests.
> > > java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
> > >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
> > >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
> > >     at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
> > >     at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:696)
> > >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:530)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:401)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:435)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:607)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:592)
> > >     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1172)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
> > >     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
> > >     at org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode.setupCluster(TestHASafeMode.java:87)
> > >
> > > Not sure what I missed here. Any idea what could be wrong here?
> > >
> > > Regards,
> > > Uma
> > >
> >
>

Re: mvn eclipse:eclipse failure on windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Uma,

I just tried getting a fresh copy of trunk and running "mvn clean install
-DskipTests" followed by "mvn eclipse:eclipse -DskipTests".  Everything
worked fine in my environment.  Are you still seeing the problem?

The UnsatisfiedLinkError seems to indicate that your build couldn't access
hadoop.dll for JNI method implementations.  hadoop.dll gets built as part
of the hadoop-common sub-module.  Is it possible that you didn't have a
complete package build for that sub-module before you started running the
HDFS test?
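
If it helps, here is a rough way to rule that out (a sketch only; the native-win profile and the target\bin path are based on the usual trunk layout, so verify them against BUILDING.txt):

    cd hadoop-common-project\hadoop-common
    mvn install -Pnative-win -DskipTests
    dir target\bin\hadoop.dll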

Chris Nauroth
Hortonworks
http://hortonworks.com/



On Sun, Jul 7, 2013 at 9:08 AM, sure bhands <su...@gmail.com> wrote:

> I would try cleaning hadoop-maven-plugin directory from maven repository to
> rule out the stale version and then mv install followed by mvn
> eclipse:eclipse before digging in to it further.
>
> Thanks,
> Surendra
>
>
> On Sun, Jul 7, 2013 at 8:28 AM, Uma Maheswara Rao G <hadoop.uma@gmail.com
> >wrote:
>
> > Hi,
> >
> > I am seeing this failure on windows while executing mvn eclipse:eclipse
> > command on trunk.
> >
> > See the following trace:
> >
> > [INFO]
> > ------------------------------------------------------------------------
> > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-eclipse-plugin:2.8:eclipse (default-cli) on project hadoop-common: Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[], exclude=[common-version-info.properties|**/*.java], test=false, filtering=false, merging with=resource src/main/resources: output=target/classes, include=[common-version-info.properties], exclude=[**/*.java], test=false, filtering=true -> [Help 1]
> > [ERROR]
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > [ERROR]
> > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> > [ERROR]
> > [ERROR] After correcting the problems, you can resume the build with the command
> >
> > [ERROR]   mvn <goals> -rf :hadoop-common
> > E:\Hadoop-Trunk>
> >
> > any idea for resolving it.
> >
> > With 'org.apache.maven.plugins:maven-eclipse-plugin:2.6:eclipse' seems to be no failures but I am seeing following exception while running tests.
> > java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
> >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
> >     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
> >     at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
> >     at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
> >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
> >     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:696)
> >     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:530)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:401)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:435)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:607)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:592)
> >     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1172)
> >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
> >     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
> >     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
> >     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
> >     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
> >     at org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode.setupCluster(TestHASafeMode.java:87)
> >
> > Not sure what I missed here. Any idea what could be wrong here?
> >
> > Regards,
> > Uma
> >
>

Re: mvn eclipse:eclipse failure on windows

Posted by sure bhands <su...@gmail.com>.
I would try cleaning the hadoop-maven-plugin directory from the local Maven
repository to rule out a stale version, and then run mvn install followed by
mvn eclipse:eclipse before digging into it further, as sketched below.
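
For example (a sketch; this assumes the default local repository location, and the plugin's directory name under org\apache\hadoop may differ on your machine):

    rmdir /s /q %USERPROFILE%\.m2\repository\org\apache\hadoop\hadoop-maven-plugins
    mvn install -DskipTests
    mvn eclipse:eclipse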

Thanks,
Surendra


On Sun, Jul 7, 2013 at 8:28 AM, Uma Maheswara Rao G <ha...@gmail.com>wrote:

> Hi,
>
> I am seeing this failure on windows while executing mvn eclipse:eclipse
> command on trunk.
>
> See the following trace:
>
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-eclipse-plugin:2.8:eclipse (default-cli) on project hadoop-common: Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[], exclude=[common-version-info.properties|**/*.java], test=false, filtering=false, merging with=resource src/main/resources: output=target/classes, include=[common-version-info.properties], exclude=[**/*.java], test=false, filtering=true -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
>
> [ERROR]   mvn <goals> -rf :hadoop-common
> E:\Hadoop-Trunk>
>
> any idea for resolving it.
>
> With 'org.apache.maven.plugins:maven-eclipse-plugin:2.6:eclipse' seems to be no failures but I am seeing following exception while running tests.
> java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>     at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:423)
>     at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:952)
>     at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:451)
>     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:282)
>     at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:200)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:696)
>     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:530)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:401)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:435)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:607)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:592)
>     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1172)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
>     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
>     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
>     at org.apache.hadoop.hdfs.server.namenode.ha.TestHASafeMode.setupCluster(TestHASafeMode.java:87)
>
> Not sure what I missed here. Any idea what could be wrong here?
>
> Regards,
> Uma
>