Posted to common-issues@hadoop.apache.org by "Andrew Klochkov (JIRA)" <ji...@apache.org> on 2010/04/05 16:59:27 UTC
[jira] Created: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Fill AWS credentials when configuring Hadoop on EC2 instances
-------------------------------------------------------------
Key: HADOOP-6681
URL: https://issues.apache.org/jira/browse/HADOOP-6681
Project: Hadoop Common
Issue Type: Improvement
Components: contrib/cloud
Reporter: Andrew Klochkov
Attachments: HADOOP-6681.patch
There's a function "configure_hadoop" in the hadoop-ec2-init-remote.sh script, which is used to configure EC2 nodes for Hadoop. The function reads the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, but they are never passed to it, so the credentials are left unset. This can be fixed in service.py by passing those variables through to the script.
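For context, the client-side change could look roughly like this sketch. The %NAME% placeholder convention and the helper name are hypothetical, only illustrating how service.py might forward the variables into the generated user-data script; the real contrib/cloud code uses its own substitution mechanism:

```python
import os

# Names of the credential variables the init script expects.
CREDENTIAL_VARS = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")

def fill_aws_credentials(user_data_template):
    """Substitute %NAME% placeholders in an EC2 user-data script
    template with values from the local environment.

    Variables that are unset locally are replaced with an empty
    string, matching the current (broken) behavior on the instance.
    """
    for name in CREDENTIAL_VARS:
        user_data_template = user_data_template.replace(
            "%" + name + "%", os.environ.get(name, ""))
    return user_data_template
```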
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
[jira] Updated: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Andrew Klochkov (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Klochkov updated HADOOP-6681:
------------------------------------
Attachment: HADOOP-6681.patch
[jira] Commented: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Hadoop QA (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12853518#action_12853518 ]
Hadoop QA commented on HADOOP-6681:
-----------------------------------
-1 overall. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/12440759/HADOOP-6681.patch
against trunk revision 930096.
+1 @author. The patch does not contain any @author tags.
-1 tests included. The patch doesn't appear to include any new or modified tests.
Please justify why no new tests are needed for this patch.
Also please list what manual steps were performed to verify this patch.
+1 javadoc. The javadoc tool did not generate any warning messages.
+1 javac. The applied patch does not increase the total number of javac compiler warnings.
+1 findbugs. The patch does not introduce any new Findbugs warnings.
+1 release audit. The applied patch does not increase the total number of release audit warnings.
+1 core tests. The patch passed core unit tests.
+1 contrib tests. The patch passed contrib unit tests.
Test results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-h4.grid.sp2.yahoo.net/441/testReport/
Findbugs warnings: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-h4.grid.sp2.yahoo.net/441/artifact/trunk/build/test/findbugs/newPatchFindbugsWarnings.html
Checkstyle results: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-h4.grid.sp2.yahoo.net/441/artifact/trunk/build/test/checkstyle-errors.html
Console output: http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-h4.grid.sp2.yahoo.net/441/console
This message is automatically generated.
[jira] Commented: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Tom White (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12854098#action_12854098 ]
Tom White commented on HADOOP-6681:
-----------------------------------
Passing these credentials automatically may not be what the user wants, and may in fact be a security liability (since the credentials are passed to the cluster, which may be shared with other users). You can achieve the same result explicitly with the following command line arguments:
{code}
--env AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
{code}
Does this solve your issue?
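A client-side helper in this spirit might look like the following sketch (the function is hypothetical, not part of the actual contrib/cloud code): it builds --env argument pairs only for variables the user has exported, so credentials are forwarded to the cluster only on explicit opt-in.

```python
import os

def optional_env_args(names):
    """Build ["--env", "NAME=value"] argument pairs for each named
    variable that is set in the caller's environment.

    Unset variables are skipped entirely, so nothing is forwarded
    unless the user has exported it.
    """
    args = []
    for name in names:
        value = os.environ.get(name)
        if value is not None:
            args.extend(["--env", "%s=%s" % (name, value)])
    return args
```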
[jira] Updated: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Tom White (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Tom White updated HADOOP-6681:
------------------------------
Resolution: Won't Fix
Status: Resolved (was: Patch Available)
Thanks Andrew.
[jira] Updated: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Andrew Klochkov (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Klochkov updated HADOOP-6681:
------------------------------------
Release Note: Fill AWS credentials when configuring Hadoop EC2 instances
Status: Patch Available (was: Open)
[jira] Commented: (HADOOP-6681) Fill AWS credentials when configuring Hadoop on EC2 instances
Posted by "Andrew Klochkov (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12855356#action_12855356 ]
Andrew Klochkov commented on HADOOP-6681:
-----------------------------------------
Yes, it does solve the issue.