Posted to dev@ambari.apache.org by "Andrew Onischuk (JIRA)" <ji...@apache.org> on 2015/02/13 12:20:11 UTC

[jira] [Created] (AMBARI-9618) Add checks in Ambaripreupload.py to minimize the chance of parallel tarball uploads.

Andrew Onischuk created AMBARI-9618:
---------------------------------------

             Summary: Add checks in Ambaripreupload.py to minimize the chance of parallel tarball uploads.
                 Key: AMBARI-9618
                 URL: https://issues.apache.org/jira/browse/AMBARI-9618
             Project: Ambari
          Issue Type: Bug
            Reporter: Andrew Onischuk
            Assignee: Andrew Onischuk
             Fix For: 2.0.0


Add checks in Ambaripreupload.py to minimize the chance of parallel tarball
uploads.

We should add checks such as the following:

    if dir exists wasb:///hdp/apps/{{ hdp_stack_version }}/mapreduce/:
        copy_tarballs_to_hdfs("/usr/hdp/current/hadoop-client/mapreduce.tar.gz", "wasb:///hdp/apps/{{ hdp_stack_version }}/mapreduce/", 'hadoop-mapreduce-historyserver', params.mapred_user, params.hdfs_user, params.user_group)

    if dir exists wasb:///hdp/apps/{{ hdp_stack_version }}/sqoop/:
        copy_tarballs_to_hdfs("/usr/hdp/current/sqoop-client/sqoop.tar.gz", "wasb:///hdp/apps/{{ hdp_stack_version }}/sqoop/", 'hadoop-mapreduce-historyserver', params.mapred_user, params.hdfs_user, params.user_group)

We should do this for every copy_tarballs_to_hdfs call in the file: check
whether the target directory exists, and only then copy the tarball.
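The guard described above can be sketched as follows. This is a minimal illustration, not the actual Ambaripreupload.py code: dir_exists() is a hypothetical helper (the real script would go through Ambari's resource_management wrappers rather than shelling out directly), and guarded_copy() takes the copy routine and the existence check as callables so the guard logic is easy to exercise in isolation.

```python
import subprocess

def dir_exists(uri):
    # Hypothetical helper: `hadoop fs -test -d <uri>` exits 0 when the
    # directory (HDFS or WASB) is present, non-zero otherwise.
    return subprocess.call(["hadoop", "fs", "-test", "-d", uri]) == 0

def guarded_copy(local_tarball, dest_dir, copy_fn, exists_fn=dir_exists):
    # Copy the tarball only when the destination directory already exists,
    # mirroring the pseudocode above; returns True if the copy was attempted.
    if exists_fn(dest_dir):
        copy_fn(local_tarball, dest_dir)
        return True
    return False
```

With this shape, each copy_tarballs_to_hdfs call site in the script would be wrapped the same way, so a destination whose directory has not been set up yet is simply skipped instead of being written to concurrently.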

The reason for doing this is that when Ambari is installing/uploading the
tarballs, there is less chance of the Ambaripreupload.py script creating a
race condition on the tarball upload.





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)