Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2016/01/30 01:37:39 UTC
[jira] [Resolved] (SPARK-9688) Improve spark-ec2 script to handle users that are not root
[ https://issues.apache.org/jira/browse/SPARK-9688?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shivaram Venkataraman resolved SPARK-9688.
------------------------------------------
Resolution: Won't Fix
Now tracked at https://github.com/amplab/spark-ec2/issues/1
> Improve spark-ec2 script to handle users that are not root
> ----------------------------------------------------------
>
> Key: SPARK-9688
> URL: https://issues.apache.org/jira/browse/SPARK-9688
> Project: Spark
> Issue Type: Improvement
> Components: EC2
> Affects Versions: 1.4.0, 1.4.1
> Environment: All
> Reporter: Karina Uribe
> Labels: EC2, aws-ec2, security
> Original Estimate: 252h
> Remaining Estimate: 252h
>
> Hi,
> I was trying to use the spark-ec2 script from Spark to create a new Spark cluster with a user other than root (--user=ec2-user). Unfortunately, the part of the script that copies the templates onto the target machines fails, because it tries to rsync into /etc/* and /root/*.
> This is the full traceback:
> rsync: recv_generator: mkdir "/root/spark-ec2" failed: Permission denied (13)
> *** Skipping any contents from this failed directory ***
> sent 95 bytes received 17 bytes 224.00 bytes/sec
> total size is 1444 speedup is 12.89
> rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1039) [sender=3.0.6]
> Traceback (most recent call last):
>   File "/home/ec2-user/spark-1.4.0/ec2/spark_ec2.py", line 1455, in <module>
>     main()
>   File "/home/ec2-user/spark-1.4.0/ec2/spark_ec2.py", line 1447, in main
>     real_main()
>   File "/home/ec2-user/spark-1.4.0/ec2/spark_ec2.py", line 1283, in real_main
>     setup_cluster(conn, master_nodes, slave_nodes, opts, True)
>   File "/home/ec2-user/spark-1.4.0/ec2/spark_ec2.py", line 785, in setup_cluster
>     modules=modules
>   File "/home/ec2-user/spark-1.4.0/ec2/spark_ec2.py", line 1049, in deploy_files
>     subprocess.check_call(command)
>   File "/usr/lib64/python2.7/subprocess.py", line 540, in check_call
>     raise CalledProcessError(retcode, cmd)
> subprocess.CalledProcessError: Command '['rsync', '-rv', '-e', 'ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i /home/ec2-user/.ssh/sparkclusterkey_us_east.pem', '/tmp/tmpT4Iw54/', u'ec2-user@ec2-52-2-96-193.compute-1.amazonaws.com:/']' returned non-zero exit status 23
> Is there a workaround for this? I want to improve the security of our operations by avoiding use of the root user on the instances.
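As a side note on the failure mode above: rsync exit status 23 means "partial transfer due to error", here caused by the remote login user lacking write access to /root. One commonly suggested workaround (not part of spark-ec2 itself, and assuming the login user has passwordless sudo, as ec2-user does on Amazon Linux AMIs) is to make the remote side of the transfer run under sudo via rsync's --rsync-path option. A hypothetical sketch of what the deploy command in spark_ec2.py's deploy_files could look like with that change; the key path and hostname are placeholders:

```python
# Hypothetical variant of the rsync invocation from deploy_files,
# with --rsync-path added so the remote rsync runs as root even
# though the SSH login user is ec2-user. Assumes passwordless sudo
# on the instance; paths/hostnames below are placeholders.
command = [
    'rsync', '-rv',
    '-e', ('ssh -o StrictHostKeyChecking=no '
           '-o UserKnownHostsFile=/dev/null '
           '-i /path/to/key.pem'),
    '--rsync-path=sudo rsync',   # remote side writes /root, /etc as root
    '/tmp/deploy-staging/',      # local staging dir of rendered templates
    'ec2-user@<master-public-dns>:/',
]
# subprocess.check_call(command) would then be invoked as in the script.
```

Whether this is acceptable depends on the AMI's sudoers configuration (some require a TTY for sudo, which would also need to be relaxed), which is presumably part of why the work was estimated as substantial rather than a one-line fix.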
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org