Posted to hdfs-user@hadoop.apache.org by yogendra reddy <yo...@gmail.com> on 2015/01/06 10:23:35 UTC

error using MiniDFSCluster in windows

Hi,

Unit tests using MiniDFSCluster are failing with the error below. It looks
like they require hadoop.dll on the library path and winutils.exe reachable
through HADOOP_HOME. Is there anything I can configure in the pom.xml file so
that the library path and HADOOP_HOME requirements get injected?
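For concreteness, the kind of pom.xml injection I have in mind would be
something like the following (just a sketch, I don't know if this is the
recommended approach -- ${hadoop.home.dir} is a placeholder property for
wherever a built hadoop.dll and winutils.exe live on the machine):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- winutils.exe is resolved under %HADOOP_HOME%\bin -->
    <environmentVariables>
      <HADOOP_HOME>${hadoop.home.dir}</HADOOP_HOME>
      <PATH>${env.PATH};${hadoop.home.dir}/bin</PATH>
    </environmentVariables>
    <!-- hadoop.dll must be loadable by the forked test JVM, so pass
         java.library.path as a JVM argument rather than a system property -->
    <argLine>-Djava.library.path=${hadoop.home.dir}/bin</argLine>
  </configuration>
</plugin>
```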



java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
    at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:484)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:879)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:638)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:455)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:511)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:670)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:655)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1304)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:975)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:856)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:702)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:374)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:355)
    at JobConfigurationFileReaderTest.shouldSelectFilesForThePastHour(JobConfigurationFileReaderTest.java:53)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
    at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
    at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
    at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)


Thanks,

Yogendra

Re: error using MiniDFSCluster in windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Yogendra,

This is something that I had fixed as part of HDFS-573.

https://issues.apache.org/jira/browse/HDFS-573

The relevant portion of hadoop-hdfs-project/hadoop-hdfs/pom.xml is here,
where we set up HADOOP_HOME and PATH to reference hadoop.dll and
winutils.exe in the hadoop-common-project/hadoop-common build:

    <macrodef name="run-test">
      <attribute name="test"/>
      <sequential>
        <echo message="Running @{test}"/>
        <exec executable="${project.build.directory}/native/RelWithDebInfo/@{test}"
              failonerror="true" dir="${project.build.directory}/native/">
          <env key="CLASSPATH" value="${test_classpath}:${compile_classpath}"/>
          <!-- HADOOP_HOME required to find winutils. -->
          <env key="HADOOP_HOME" value="${hadoop.common.build.dir}"/>
          <!-- Make sure hadoop.dll and jvm.dll are on PATH. -->
          <env key="PATH" value="${env.PATH};${hadoop.common.build.dir}/bin;${java.home}/jre/bin/server;${java.home}/bin/server"/>
        </exec>
        <echo message="Finished @{test}"/>
      </sequential>
    </macrodef>

This means that if you have a built hadoop.dll and winutils.exe in your
hadoop-common module, then the libhdfs tests in the hadoop-hdfs module will
find them there.  I expect you can resolve the problem by either rebuilding
the hadoop-common module and then rerunning the hadoop-hdfs build, or by
building the whole project from the root of the source tree.
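For reference, the rebuild route usually looks roughly like this (a sketch
based on the Windows instructions in BUILDING.txt; the profile names and
module paths assume a standard Hadoop source checkout and build environment):

```shell
# From the root of the Hadoop source tree, in a Windows build prompt:
# build everything, including hadoop.dll and winutils.exe.
mvn clean install -DskipTests -Pdist,native-win

# Or rebuild only hadoop-common (the native bits land under
# hadoop-common-project/hadoop-common/target/bin), then rerun the HDFS tests:
mvn clean install -DskipTests -Pnative-win -pl hadoop-common-project/hadoop-common
mvn test -pl hadoop-hdfs-project/hadoop-hdfs
```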

Chris Nauroth
Hortonworks
http://hortonworks.com/


-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.

Re: error using MiniDFSCluster in windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Yogendra,

This is something that I had fixed as part of HDFS-573.

https://issues.apache.org/jira/browse/HDFS-573

The relevant portion of hadoop-hdfs-project/hadoop-hdfs/pom.xml is here,
where we set up HADOOP_HOME and PATH to reference hadoop.dll and
winutils.exe in the hadoop-common-project/hadoop-common build:

                    <macrodef name="run-test">
                      <attribute name="test"/>
                      <sequential>
                        <echo message="Running @{test}"/>
                        <exec
executable="${project.build.directory}/native/RelWithDebInfo/@{test}"
failonerror="true" dir="${project.build.directory}/native/">
                          <env key="CLASSPATH"
value="${test_classpath}:${compile_classpath}"/>
                          <!-- HADOOP_HOME required to find winutils. -->
                          <env key="HADOOP_HOME"
value="${hadoop.common.build.dir}"/>
                          <!-- Make sure hadoop.dll and jvm.dll are on
PATH. -->
                          <env key="PATH"
value="${env.PATH};${hadoop.common.build.dir}/bin;${java.home}/jre/bin/server;${java.home}/bin/server"/>
                        </exec>
                        <echo message="Finished @{test}"/>
                      </sequential>
                    </macrodef>

This means if you have a built hadoop.dll and winutils.exe in your
hadoop-common module, then the libhdfs tests in the hadoop-hdfs module will
find it there.  I expect you can resolve the problem by either rebuilding
the hadoop-common module and then rerunning the hadoop-hdfs build, or just
build the whole project from the root of the source tree.

Chris Nauroth
Hortonworks
http://hortonworks.com/


On Tue, Jan 6, 2015 at 1:23 AM, yogendra reddy <yo...@gmail.com>
wrote:

> Hi,
>
> unit tests with minidfscluster are failing with this error. Looks like it
> requires hadoop.dll in lib path and winutils.exe accessible through
> HADOOP_HOME. Is there anything I can configure in pom.xml file so that the
> library path and HADOOP_HOME requirements get injected?
>
>
>
> java.lang.UnsatisfiedLinkError:
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>
>               at
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>
>               at
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
>
>               at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
>
>               at
> org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:484)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:292)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:879)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:638)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:455)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:511)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:670)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:655)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1304)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:975)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:856)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:702)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:374)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:355)
>
>               at
> JobConfigurationFileReaderTest.shouldSelectFilesForThePastHour(JobConfigurationFileReaderTest.java:53)
>
>               at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>
>               at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>               at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>               at java.lang.reflect.Method.invoke(Method.java:597)
>
>               at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>
>               at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>
>               at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>
>               at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>
>               at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>
>               at
> org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>
>               at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>
>               at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>
>               at
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>
>               at
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>
>               at
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>
>               at
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>
>               at
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>
>               at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
>
>               at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>
>               at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>               at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>               at java.lang.reflect.Method.invoke(Method.java:597)
>
>               at
> org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
>
>               at
> org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
>
>               at
> org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
>
>               at
> org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
>
>               at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
>
>
> Thanks,
>
> Yogendra
>
>
>

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.

Re: error using MiniDFSCluster in windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Yogendra,

This is something that I had fixed as part of HDFS-573.

https://issues.apache.org/jira/browse/HDFS-573

The relevant portion of hadoop-hdfs-project/hadoop-hdfs/pom.xml is here,
where we set up HADOOP_HOME and PATH to reference hadoop.dll and
winutils.exe in the hadoop-common-project/hadoop-common build:

                    <macrodef name="run-test">
                      <attribute name="test"/>
                      <sequential>
                        <echo message="Running @{test}"/>
                        <exec
executable="${project.build.directory}/native/RelWithDebInfo/@{test}"
failonerror="true" dir="${project.build.directory}/native/">
                          <env key="CLASSPATH"
value="${test_classpath}:${compile_classpath}"/>
                          <!-- HADOOP_HOME required to find winutils. -->
                          <env key="HADOOP_HOME"
value="${hadoop.common.build.dir}"/>
                          <!-- Make sure hadoop.dll and jvm.dll are on
PATH. -->
                          <env key="PATH"
value="${env.PATH};${hadoop.common.build.dir}/bin;${java.home}/jre/bin/server;${java.home}/bin/server"/>
                        </exec>
                        <echo message="Finished @{test}"/>
                      </sequential>
                    </macrodef>

This means if you have a built hadoop.dll and winutils.exe in your
hadoop-common module, then the libhdfs tests in the hadoop-hdfs module will
find it there.  I expect you can resolve the problem by either rebuilding
the hadoop-common module and then rerunning the hadoop-hdfs build, or just
build the whole project from the root of the source tree.

Chris Nauroth
Hortonworks
http://hortonworks.com/


On Tue, Jan 6, 2015 at 1:23 AM, yogendra reddy <yo...@gmail.com>
wrote:

> Hi,
>
> unit tests with minidfscluster are failing with this error. Looks like it
> requires hadoop.dll in lib path and winutils.exe accessible through
> HADOOP_HOME. Is there anything I can configure in pom.xml file so that the
> library path and HADOOP_HOME requirements get injected?
>
>
>
> java.lang.UnsatisfiedLinkError:
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>
>               at
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>
>               at
> org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
>
>               at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
>
>               at
> org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:484)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:292)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:879)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:638)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:455)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:511)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:670)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:655)
>
>               at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1304)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:975)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:856)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:702)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:374)
>
>               at
> org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:355)
>
>               at
> JobConfigurationFileReaderTest.shouldSelectFilesForThePastHour(JobConfigurationFileReaderTest.java:53)
>
>               at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>
>               at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>               at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>               at java.lang.reflect.Method.invoke(Method.java:597)
>
>               at
> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>
>               at
> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>
>               at
> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>
>               at
> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>
>               at
> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>
>               at
> org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>
>               at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>
>               at
> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>
>               at
> org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>
>               at
> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>
>               at
> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>
>               at
> org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>
>               at
> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>
>               at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
>
>               at
> org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
>
>               at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
>
>               at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>
>               at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>
>               at java.lang.reflect.Method.invoke(Method.java:597)
>
>               at
> org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
>
>               at
> org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
>
>               at
> org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
>
>               at
> org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
>
>               at
> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
>
>
> Thanks,
>
> Yogendra
>
>
>

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.

Re: error using MiniDFSCluster in windows

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Yogendra,

This is something that I had fixed as part of HDFS-573.

https://issues.apache.org/jira/browse/HDFS-573

The relevant portion of hadoop-hdfs-project/hadoop-hdfs/pom.xml is here,
where we set up HADOOP_HOME and PATH to reference hadoop.dll and
winutils.exe in the hadoop-common-project/hadoop-common build:

                    <macrodef name="run-test">
                      <attribute name="test"/>
                      <sequential>
                        <echo message="Running @{test}"/>
                        <exec
executable="${project.build.directory}/native/RelWithDebInfo/@{test}"
failonerror="true" dir="${project.build.directory}/native/">
                          <env key="CLASSPATH"
value="${test_classpath}:${compile_classpath}"/>
                          <!-- HADOOP_HOME required to find winutils. -->
                          <env key="HADOOP_HOME"
value="${hadoop.common.build.dir}"/>
                          <!-- Make sure hadoop.dll and jvm.dll are on
PATH. -->
                          <env key="PATH"
value="${env.PATH};${hadoop.common.build.dir}/bin;${java.home}/jre/bin/server;${java.home}/bin/server"/>
                        </exec>
                        <echo message="Finished @{test}"/>
                      </sequential>
                    </macrodef>

This means if you have a built hadoop.dll and winutils.exe in your
hadoop-common module, then the libhdfs tests in the hadoop-hdfs module will
find it there.  I expect you can resolve the problem by either rebuilding
the hadoop-common module and then rerunning the hadoop-hdfs build, or just
build the whole project from the root of the source tree.

Chris Nauroth
Hortonworks
http://hortonworks.com/


On Tue, Jan 6, 2015 at 1:23 AM, yogendra reddy <yo...@gmail.com>
wrote:

> Hi,
>
> unit tests with minidfscluster are failing with this error. Looks like it
> requires hadoop.dll in lib path and winutils.exe accessible through
> HADOOP_HOME. Is there anything I can configure in pom.xml file so that the
> library path and HADOOP_HOME requirements get injected?
>
>
>
> java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
>         at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
>         at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:484)
>         at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:292)
>         at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:879)
>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFromDisk(FSNamesystem.java:638)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.loadNamesystem(NameNode.java:455)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:511)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:670)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:655)
>         at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1304)
>         at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:975)
>         at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:856)
>         at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:702)
>         at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:374)
>         at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:355)
>         at JobConfigurationFileReaderTest.shouldSelectFilesForThePastHour(JobConfigurationFileReaderTest.java:53)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>         at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>         at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>         at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>         at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>         at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>         at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>         at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>         at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>         at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>         at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>         at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>         at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>         at org.apache.maven.surefire.junit4.JUnit4TestSet.execute(JUnit4TestSet.java:53)
>         at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:123)
>         at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:104)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:164)
>         at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:110)
>         at org.apache.maven.surefire.booter.SurefireStarter.invokeProvider(SurefireStarter.java:175)
>         at org.apache.maven.surefire.booter.SurefireStarter.runSuitesInProcessWhenForked(SurefireStarter.java:107)
>         at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:68)
>
>
> Thanks,
>
> Yogendra
>
>
>
