Posted to dev@nutch.apache.org by "Rustam Abdullaev (JIRA)" <ji...@apache.org> on 2018/09/04 11:05:00 UTC

[jira] [Updated] (NUTCH-2639) java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0

     [ https://issues.apache.org/jira/browse/NUTCH-2639?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rustam Abdullaev updated NUTCH-2639:
------------------------------------
    Affects Version/s: 2.4

> java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0
> --------------------------------------------------------------------------------------
>
>                 Key: NUTCH-2639
>                 URL: https://issues.apache.org/jira/browse/NUTCH-2639
>             Project: Nutch
>          Issue Type: Bug
>    Affects Versions: 2.4
>         Environment: Windows 10 x64 Cygwin
>            Reporter: Rustam Abdullaev
>            Priority: Major
>
> It is impossible to run Nutch under Cygwin, even when hadoop.dll is properly available in the PATH.
> The issue is two-fold:
> 1. JAVA_PLATFORM is detected as "{{Windows_NT-amd64-64\r}}" (note the trailing \r)
> 2. The non-existent directory {{lib/native/Windows_NT-amd64-64\r}} is then set as {{java.library.path}}, which makes Java ignore the system PATH when looking for hadoop.dll.
>  
> As a result, the following exception is thrown on any Gora backend access:
> {noformat}
> java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
>         at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:570)
>         at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
>         at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:173)
>         at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:160)
>         at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:94)
>         at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
>         at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
>         at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
>         at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:131)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:163)
>         at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:432)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
>         at org.apache.nutch.util.NutchJob.waitForCompletion(NutchJob.java:115)
>         at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:249)
>         at org.apache.nutch.crawl.InjectorJob.inject(InjectorJob.java:270)
>         at org.apache.nutch.crawl.InjectorJob.run(InjectorJob.java:293)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.nutch.crawl.InjectorJob.main(InjectorJob.java:302)
> {noformat}
>  
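[Editor's note: a minimal workaround sketch for the \r problem described above. It assumes the platform string is computed in a shell script such as bin/nutch; the variable names here are illustrative, not Nutch's actual ones. The idea is simply to strip the carriage return before the value is used to build java.library.path.]

```shell
# Hypothetical sketch: under Cygwin, CRLF line endings can leave a trailing
# \r in the detected platform string. Deleting it yields a directory name
# that can actually exist on disk.
raw_platform="Windows_NT-amd64-64$(printf '\r')"        # simulated bad value
JAVA_PLATFORM=$(printf '%s' "$raw_platform" | tr -d '\r')
printf '%s\n' "$JAVA_PLATFORM"
```

With the \r removed, lib/native/${JAVA_PLATFORM} resolves to a plausible path, and Java can fall back to the system PATH to locate hadoop.dll.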



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)