Posted to common-dev@hadoop.apache.org by "Colin Patrick McCabe (JIRA)" <ji...@apache.org> on 2014/07/22 02:58:40 UTC

[jira] [Resolved] (HADOOP-10870) Failed to load OpenSSL cipher error logs on systems with old openssl versions

     [ https://issues.apache.org/jira/browse/HADOOP-10870?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Colin Patrick McCabe resolved HADOOP-10870.
-------------------------------------------

          Resolution: Fixed
       Fix Version/s: fs-encryption (HADOOP-10150 and HDFS-6134)
    Target Version/s: fs-encryption (HADOOP-10150 and HDFS-6134)

> Failed to load OpenSSL cipher error logs on systems with old openssl versions
> -----------------------------------------------------------------------------
>
>                 Key: HADOOP-10870
>                 URL: https://issues.apache.org/jira/browse/HADOOP-10870
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: security
>    Affects Versions: fs-encryption (HADOOP-10150 and HDFS-6134)
>            Reporter: Stephen Chu
>            Assignee: Colin Patrick McCabe
>             Fix For: fs-encryption (HADOOP-10150 and HDFS-6134)
>
>         Attachments: HADOOP-10870-fs-enc.001.patch
>
>
> I built Hadoop from the fs-encryption branch and deployed it (without enabling any security configuration) on a CentOS 6.4 VM with an old version of openssl:
> {code}
> [root@schu-enc hadoop-common]# rpm -qa | grep openssl
> openssl-1.0.0-27.el6_4.2.x86_64
> openssl-devel-1.0.0-27.el6_4.2.x86_64
> {code}
> When I try to do a simple "hadoop fs -ls", I get
> {code}
> [hdfs@schu-enc hadoop-common]$ hadoop fs -ls
> 2014-07-21 19:35:14,486 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
> java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
> 	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
> 	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
> 	at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
> 	at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:55)
> 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:591)
> 	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:561)
> 	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:139)
> 	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2590)
> 	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
> 	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2624)
> 	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2606)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
> 	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
> 	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> 	at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
> 	at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:228)
> 	at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:211)
> 	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:194)
> 	at org.apache.hadoop.fs.shell.Command.run(Command.java:155)
> 	at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
> 	at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
> 2014-07-21 19:35:14,495 WARN  [main] crypto.CryptoCodec (CryptoCodec.java:getInstance(66)) - Crypto codec org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec is not available.
> {code}
> It would be an improvement to clean up and shorten this error log.
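> One way to shorten it would be for OpensslCipher's static initializer to catch the load failure, keep the full stack trace at DEBUG level only, and expose a short reason string for callers to report. Below is only a rough sketch of that idea; the field and helper names are illustrative, not the committed patch.
> {code}
> // Illustrative sketch only -- not the actual patch.
> import org.apache.commons.logging.Log;
> import org.apache.commons.logging.LogFactory;
> 
> public final class OpensslCipher {
>   private static final Log LOG = LogFactory.getLog(OpensslCipher.class);
> 
>   // Short, human-readable reason the native cipher failed to load; null if it loaded.
>   private static final String loadingFailureReason;
> 
>   static {
>     String failure = null;
>     try {
>       initIDs();  // native hook; throws UnsatisfiedLinkError on old openssl
>     } catch (Throwable t) {
>       failure = t.getMessage();  // e.g. "Cannot find AES-CTR support, ..."
>       LOG.debug("Failed to load OpenSSL Cipher.", t);  // full trace only at DEBUG
>     } finally {
>       loadingFailureReason = failure;
>     }
>   }
> 
>   /** Null if the native cipher loaded, otherwise a one-line reason callers can print. */
>   public static String getLoadingFailureReason() {
>     return loadingFailureReason;
>   }
> 
>   private static native void initIDs();
> }
> {code}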
> The hadoop checknative command shows the error as well:
> {code}
> [hdfs@schu-enc ~]$ hadoop checknative
> 2014-07-21 19:38:38,376 INFO  [main] bzip2.Bzip2Factory (Bzip2Factory.java:isNativeBzip2Loaded(70)) - Successfully loaded & initialized native-bzip2 library system-native
> 2014-07-21 19:38:38,395 INFO  [main] zlib.ZlibFactory (ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized native-zlib library
> 2014-07-21 19:38:38,411 ERROR [main] crypto.OpensslCipher (OpensslCipher.java:<clinit>(87)) - Failed to load OpenSSL Cipher.
> java.lang.UnsatisfiedLinkError: Cannot find AES-CTR support, is your version of Openssl new enough?
> 	at org.apache.hadoop.crypto.OpensslCipher.initIDs(Native Method)
> 	at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:84)
> 	at org.apache.hadoop.util.NativeLibraryChecker.main(NativeLibraryChecker.java:82)
> Native library checking:
> hadoop:  true /home/hdfs/hadoop-3.0.0-SNAPSHOT/lib/native/libhadoop.so.1.0.0
> zlib:    true /lib64/libz.so.1
> snappy:  true /usr/lib64/libsnappy.so.1
> lz4:     true revision:99
> bzip2:   true /lib64/libbz2.so.1
> openssl: false 
> {code}
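> With a helper like the hypothetical getLoadingFailureReason() sketched above, checknative could report openssl in a single line instead of a stack trace. Again only a rough sketch (a fragment of NativeLibraryChecker.main, with assumed names):
> {code}
> // Sketch: print a one-line openssl status using the assumed helper above.
> boolean opensslLoaded = false;
> String opensslDetail = "";
> String reason = OpensslCipher.getLoadingFailureReason();
> if (reason != null) {
>   opensslDetail = reason;  // e.g. "Cannot find AES-CTR support, ..."
> } else {
>   opensslLoaded = true;
> }
> System.out.printf("openssl: %-5b %s%n", opensslLoaded, opensslDetail);
> {code}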
> Thanks to cmccabe, who identified this issue as a bug.


