Posted to common-issues@hadoop.apache.org by "Dian Fu (JIRA)" <ji...@apache.org> on 2014/11/24 06:13:12 UTC

[jira] [Updated] (HADOOP-11329) should add HADOOP_HOME as part of kms's startup options

     [ https://issues.apache.org/jira/browse/HADOOP-11329?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dian Fu updated HADOOP-11329:
-----------------------------
    Description: 
Currently, HADOOP_HOME isn't part of the startup options of KMS. If I add the following configuration to the KMS core-site.xml,
{code}
<property>
  <name>hadoop.security.crypto.codec.classes.aes.ctr.nopadding</name>
  <value>org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec</value>
</property>
{code}
the KMS server throws the following exception when it receives a "generateEncryptedKey" request:
{code}
2014-11-24 10:23:18,189 DEBUG org.apache.hadoop.crypto.OpensslCipher: Failed to load OpenSSL Cipher.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl()Z
        at org.apache.hadoop.util.NativeCodeLoader.buildSupportsOpenssl(Native Method)
        at org.apache.hadoop.crypto.OpensslCipher.<clinit>(OpensslCipher.java:85)
        at org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec.<init>(OpensslAesCtrCryptoCodec.java:50)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
        at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:67)
        at org.apache.hadoop.crypto.CryptoCodec.getInstance(CryptoCodec.java:100)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension$DefaultCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:256)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension$EncryptedQueueRefiller.fillQueueForKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:77)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:181)
        at org.apache.hadoop.crypto.key.kms.ValueQueue$1.load(ValueQueue.java:175)
        at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
        at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
        at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
        at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
        at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
        at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3969)
        at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4829)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getAtMost(ValueQueue.java:256)
        at org.apache.hadoop.crypto.key.kms.ValueQueue.getNext(ValueQueue.java:226)
        at org.apache.hadoop.crypto.key.kms.server.EagerKeyGeneratorKeyProviderCryptoExtension$CryptoExtension.generateEncryptedKey(EagerKeyGeneratorKeyProviderCryptoExtension.java:126)
        at org.apache.hadoop.crypto.key.KeyProviderCryptoExtension.generateEncryptedKey(KeyProviderCryptoExtension.java:371)
        at org.apache.hadoop.crypto.key.kms.server.KeyAuthorizationKeyProvider.generateEncryptedKey(KeyAuthorizationKeyProvider.java:192)
        at org.apache.hadoop.crypto.key.kms.server.KMS$9.run(KMS.java:379)
        at org.apache.hadoop.crypto.key.kms.server.KMS$9.run(KMS.java:375)
{code}
The reason is that the KMS process cannot find libhadoop.so: without HADOOP_HOME in its startup options, the JVM's java.library.path does not include the Hadoop native library directory, so the OpenSSL-backed codec cannot be loaded.
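
Until HADOOP_HOME is wired into the KMS startup options, a manual workaround is to point the KMS JVM at the native library directory yourself. The snippet below is only a sketch under assumptions (that the Tomcat-based kms.sh forwards CATALINA_OPTS to catalina.sh, and that libhadoop.so sits under $HADOOP_HOME/lib/native); the install path shown is hypothetical:
{code}
# Assumption-based workaround sketch, e.g. in kms-env.sh or the shell that launches KMS.
# /usr/local/hadoop is a hypothetical install location -- adjust to your layout.
export HADOOP_HOME=/usr/local/hadoop
# Make the KMS JVM look for libhadoop.so in the Hadoop native library directory.
export CATALINA_OPTS="${CATALINA_OPTS} -Djava.library.path=${HADOOP_HOME}/lib/native"
sbin/kms.sh start
{code}
A proper fix would be for the KMS startup scripts themselves to export HADOOP_HOME (or derive java.library.path from it) so that NativeCodeLoader can locate libhadoop.so without per-deployment tweaks.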


> should add HADOOP_HOME as part of kms's startup options
> -------------------------------------------------------
>
>                 Key: HADOOP-11329
>                 URL: https://issues.apache.org/jira/browse/HADOOP-11329
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: kms, security
>            Reporter: Dian Fu
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)