Posted to dev@ambari.apache.org by Alejandro Fernandez <af...@hortonworks.com> on 2014/10/22 01:56:49 UTC

Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/
-----------------------------------------------------------

(Updated Oct. 21, 2014, 11:56 p.m.)


Review request for Ambari, Dmytro Sen, Sumit Mohanty, and Sid Wagle.


Bugs: AMBARI-7892
    https://issues.apache.org/jira/browse/AMBARI-7892


Repository: ambari


Description
-------

This is related to AMBARI-7842.
WebHCat relies on the following tarballs/jars:

|| File || Property ||
| pig-*.tar.gz | templeton.pig.archive |
| hive-*.tar.gz | templeton.hive.archive |
| sqoop-*.tar.gz | templeton.sqoop.archive |
| hadoop-streaming-*.jar | templeton.streaming.jar |

Each of these needs to be copied to HDFS, and the corresponding property must be set to the file's fully qualified HDFS path.
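
For illustration only, a rough Python sketch of that flow. All names below are hypothetical; the real logic in this patch lives in dynamic_variable_interpretation.py, whose API may differ.

import os
import re
import subprocess

def interpret(value, variables):
    """Substitute {{ name }} placeholders, e.g. {{ hdp_stack_version }}."""
    return re.sub(r"\{\{\s*(\w+)\s*\}\}",
                  lambda m: str(variables.get(m.group(1), m.group(0))),
                  value)

def copy_tarball_and_set_property(local_path, hdfs_dest_dir, props, prop_name):
    """Copy one tarball/jar into HDFS and point the given templeton
    property at its fully qualified HDFS path."""
    hdfs_path = hdfs_dest_dir.rstrip("/") + "/" + os.path.basename(local_path)
    # Equivalent of: hdfs --config /etc/hadoop/conf dfs -put -f SRC DST
    subprocess.check_call(["hdfs", "--config", "/etc/hadoop/conf",
                           "dfs", "-put", "-f", local_path, hdfs_path])
    props[prop_name] = hdfs_path  # inject the versioned HDFS path

# e.g., once the {{ }} placeholders in pig_tar_source are resolved:
# copy_tarball_and_set_property(resolved_pig_tarball,
#     "hdfs:///hdp/apps/2.2.0.0-974/pig/", webhcat_site,
#     "templeton.pig.archive")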


Diffs
-----

  ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py efe7e63 
  ambari-common/src/main/python/resource_management/libraries/functions/version.py PRE-CREATION 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/configuration/webhcat-site.xml 0523dab 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py 7c86070 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py 4aad1a2 
  ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py 71989c3 
  ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml cc52fe3 
  ambari-server/src/main/resources/stacks/HDP/2.2/services/HIVE/configuration/webhcat-site.xml 3435a63 
  ambari-server/src/test/python/TestVersion.py PRE-CREATION 
  ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py 5f92a2d 
  ambari-server/src/test/python/stacks/2.0.6/configs/default.json 5e3bad0 
  ambari-server/src/test/python/stacks/2.0.6/configs/secured.json d65b0ee 
  ambari-server/src/test/python/stacks/2.2/configs/default.json ea474e8 
  ambari-server/src/test/python/stacks/2.2/configs/secured.json 20678fa 

Diff: https://reviews.apache.org/r/26965/diff/


Testing
-------

Ran the ambari-server unit tests:
----------------------------------------------------------------------
Total run:667
Total errors:0
Total failures:0
OK

Also verified on a cluster using the following steps.

1. Set properties
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_source                         "/usr/hdp/current/hive-client/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_destination_folder             "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_source                          "/usr/hdp/current/pig-client/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_destination_folder              "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_source                        "/usr/hdp/current/sqoop-client/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_destination_folder            "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_source             "/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.jar                  "/usr/hdp/current/hive-webhcat/share/webhcat/svr/lib/hive-webhcat-*.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.pig.archive          "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.hive.archive         "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.sqoop.archive        "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.streaming.jar        "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"

2. Verify the properties were saved.
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=cluster-env
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=webhcat-site

3. Copy changed files
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/script/config_dictionary.py                   /usr/lib/ambari-server/lib/resource_management/libraries/script/config_dictionary.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/version.py                          /usr/lib/ambari-server/lib/resource_management/libraries/functions/version.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py  /usr/lib/ambari-server/lib/resource_management/libraries/functions/dynamic_variable_interpretation.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py                /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py                 /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py                    /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/TEZ/package/scripts/params.py

4. Check that tarballs are not already in HDFS. If they are, delete them.
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -rm -r /hdp/apps/2.2.0.0-974/

5. Before starting WebHCat, check that the properties in webhcat-site.xml are still unversioned.
less /etc/hive-webhcat/conf/webhcat-site.xml
/ templeton.*archive
/ templeton.*jar

6. Restart WebHCat and verify the files are copied to HDFS.
python /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat_server.py START /var/lib/ambari-agent/data/command-102.json /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS /var/lib/ambari-agent/data/output-102.txt DEBUG /var/lib/ambari-agent/data/tmp
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/

7. Verify that webhcat-site.xml now contains actual values.

Check /etc/hive-webhcat/conf/webhcat-site.xml again; this time the properties should contain the versioned paths.


Thanks,

Alejandro Fernandez


Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Jonathan Hurley <jh...@hortonworks.com>.

> On Oct. 22, 2014, 8:41 a.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 55
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line55>
> >
> >     What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?
> 
> Alejandro Fernandez wrote:
>     There's another function in this file to format the strings; it would convert 2.10 to 2.10.0.0 (assuming it is an HDP stack version).

>>> cmp("2.10.0.0","2.9.0.0")
-1

OK, so 2.10.0.0 is still "less than" 2.9.0.0, right? Normalization will create the proper four-part version string, but cmp on the strings won't properly understand .10 vs .9.


- Jonathan


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.

> On Oct. 22, 2014, 12:41 p.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 40
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line40>
> >
> >     Extra comma?

Typical notation to indicate that it is a tuple with one element.
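
For illustration, in Python it is the bare trailing comma that makes the value a one-element tuple:

>>> x = 1,
>>> x
(1,)
>>> type(x)
<type 'tuple'>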


- Alejandro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.

> On Oct. 22, 2014, 12:41 p.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 25
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line25>
> >
> >     I normally try to avoid the use of double underscores since the name mangling can cause problems. If this package is not going to be subclassed (since there's no class, I would assume that's the case), then I think a single underscore is enough.

Will do this in the next patch.


> On Oct. 22, 2014, 12:41 p.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 55
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line55>
> >
> >     What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?
> 
> Alejandro Fernandez wrote:
>     There's another function in this file to format the strings; it would convert 2.10 to 2.10.0.0 (assuming it is an HDP stack version).
> 
> Jonathan Hurley wrote:
>     >>> cmp("2.10.0.0","2.9.0.0")
>     -1
>     
>     OK, so 2.10.0.0 is still "less than" 2.9.0.0, right? Normalization will create the proper four-part version string, but cmp on the strings won't properly understand .10 vs .9.
> 
> Alejandro Fernandez wrote:
>     Actually, I can format the versions to have the same number of dots.

cmp is actually called on the parsed integer lists, ([2, 10, 0, 0], [2, 9, 0, 0]), not on the raw strings.
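
A quick Python 2 check of that behavior:

>>> cmp([2, 10, 0, 0], [2, 9, 0, 0])
1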


- Alejandro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.

> On Oct. 22, 2014, 12:41 p.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 55
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line55>
> >
> >     What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?
> 
> Alejandro Fernandez wrote:
>     There's another function in this file to format the strings; it would convert 2.10 to 2.10.0.0 (assuming it is an HDP stack version).
> 
> Jonathan Hurley wrote:
>     >>> cmp("2.10.0.0","2.9.0.0")
>     -1
>     
>     OK, so 2.10.0.0 is still "less than" 2.9.0.0, right? Normalization will create the proper four-part version string, but cmp on the strings won't properly understand .10 vs .9.

Actually, I can format the versions to have the same number of dots.


- Alejandro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.

> On Oct. 22, 2014, 12:41 p.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 55
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line55>
> >
> >     What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?

There's another function in this file to format the strings; it would convert 2.10 to 2.10.0.0 (assuming it is an HDP stack version).
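
For illustration, a minimal sketch of that kind of normalization (function name hypothetical, not the actual one in version.py):

def normalize_version(version, segments=4):
    """Pad '2.10' out to '2.10.0.0' so both sides of a comparison
    have the same number of segments."""
    parts = version.split(".")
    parts += ["0"] * (segments - len(parts))
    return ".".join(parts[:segments])

# normalize_version("2.10")  -> '2.10.0.0'
# normalize_version("2.9.9") -> '2.9.9.0'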


- Alejandro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Jonathan Hurley <jh...@hortonworks.com>.

> On Oct. 22, 2014, 8:41 a.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 55
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line55>
> >
> >     What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?
> 
> Alejandro Fernandez wrote:
>     There's another function in this file that formats the strings; it would convert 2.10 to 2.10.0.0 (assuming it is an HDP stack version).
> 
> Jonathan Hurley wrote:
>     >>> cmp("2.10.0.0","2.9.0.0")
>     -1
>     
>     OK, so 2.10.0.0 is still "less than" 2.9.0.0, right? Normalization will create the proper four-part version string, but the cmp method won't properly understand .10 vs .9
> 
> Alejandro Fernandez wrote:
>     Actually, I can format the versions to have the same number of dots.
> 
> Alejandro Fernandez wrote:
>     cmp is actually called on the lists ([2, 10, 0, 0], [2, 9, 0, 0]), so the comparison is numeric, not lexicographic.

Thanks for explaining that!
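
To make that concrete, here is a minimal sketch of the scheme being described: pad both versions to the same number of parts, convert them to integer tuples, and compare the tuples rather than the raw strings (names are illustrative, not necessarily those in version.py):

    # Illustrative sketch only; assumes dot-separated numeric versions.
    def compare_versions(v1, v2, parts=4):
        def normalize(v):
            nums = [int(x) for x in v.split(".")]
            nums += [0] * (parts - len(nums))   # pad "2.10" to (2, 10, 0, 0)
            return tuple(nums)
        a, b = normalize(v1), normalize(v2)
        return (a > b) - (a < b)                # -1, 0, or 1, like cmp()

    compare_versions("2.10", "2.9.9")   # returns 1: 2.10 > 2.9.9
    cmp("2.10.0.0", "2.9.0.0")          # returns -1: the lexicographic trap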


> On Oct. 22, 2014, 8:41 a.m., Jonathan Hurley wrote:
> > ambari-common/src/main/python/resource_management/libraries/functions/version.py, line 40
> > <https://reviews.apache.org/r/26965/diff/5/?file=728338#file728338line40>
> >
> >     Extra comma?
> 
> Alejandro Fernandez wrote:
>     Typical notation to indicate that it is a tuple with one element.

Gotcha, you want to return a tuple with 1 element.
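
For anyone skimming: in Python it is the trailing comma, not the parentheses, that makes a one-element tuple:

    one = ("2.2.0.0",)     # a tuple with one element
    not_one = ("2.2.0.0")  # just a parenthesized string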


- Jonathan


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Jonathan Hurley <jh...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------



ambari-common/src/main/python/resource_management/libraries/functions/version.py
<https://reviews.apache.org/r/26965/#comment98656>

    I normally try to avoid the use of double underscores since the name mangling can cause problems. If this package is not going to be subclassed (since there's no class, I would assume that's the case), then I think a single underscore is enough.
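
    A quick illustration of the mangling concern (class and method names are
    made up for the example):

        class Base(object):
            def __helper(self):         # stored as _Base__helper
                return "base"

        class Child(Base):
            def call(self):
                return self.__helper()  # looks up _Child__helper -> AttributeError

    Module-level names are never mangled, so a single leading underscore is
    enough to signal "internal" there.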



ambari-common/src/main/python/resource_management/libraries/functions/version.py
<https://reviews.apache.org/r/26965/#comment98657>

    Extra comma?



ambari-common/src/main/python/resource_management/libraries/functions/version.py
<https://reviews.apache.org/r/26965/#comment98659>

    What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 is < 2.9.9?


- Jonathan Hurley




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Jonathan Hurley <jh...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57820
-----------------------------------------------------------

Ship it!


Thanks for the explanations of the sections I had concerns over.

- Jonathan Hurley




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Nate Cole <nc...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57773
-----------------------------------------------------------

Ship it!


Ship It!

- Nate Cole




Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/
-----------------------------------------------------------

(Updated Oct. 22, 2014, 12:54 a.m.)


Review request for Ambari, Andrew Onischuk, Dmitro Lisnichenko, Dmytro Sen, Jonathan Hurley, Nate Cole, Sumit Mohanty, Srimanth Gunturi, Sid Wagle, and Yusaku Sako.


Thanks,

Alejandro Fernandez


Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/
-----------------------------------------------------------

(Updated Oct. 22, 2014, 12:09 a.m.)


Review request for Ambari, Andrew Onischuk, Dmitro Lisnichenko, Dmytro Sen, Jonathan Hurley, Sumit Mohanty, and Sid Wagle.


Thanks,

Alejandro Fernandez


Re: Review Request 26965: WebHCat to support versioned rpms in Ambari

Posted by Alejandro Fernandez <af...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/
-----------------------------------------------------------

(Updated Oct. 22, 2014, 12:02 a.m.)


Review request for Ambari, Dmytro Sen, Sumit Mohanty, and Sid Wagle.


Bugs: AMBARI-7892
    https://issues.apache.org/jira/browse/AMBARI-7892


Repository: ambari


Description
-------

This is related to AMBARI-7842
WebHCat relies on the following tarballs/jars

|| File || Property ||
| pig-*.tar.gz | templeton.pig.archive |
|hive-*tar.gz | templeton.hive.archive|
| sqoop-*tar.gz | templeton.sqoop.archive|
|hadoop-streaming-*.jar | templeton.streaming.jar|

All of these need to be copied to HDFS, and the name of the file needs to be injected into the property with the fully qualified path in HDFS.


Diffs (updated)
-----

  ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py efe7e63 
  ambari-common/src/main/python/resource_management/libraries/functions/version.py PRE-CREATION 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/configuration/webhcat-site.xml 0523dab 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py 7c86070 
  ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py 4aad1a2 
  ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py 71989c3 
  ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml cc52fe3 
  ambari-server/src/main/resources/stacks/HDP/2.2/services/HIVE/configuration/webhcat-site.xml 3435a63 
  ambari-server/src/test/python/TestVersion.py PRE-CREATION 
  ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py 5f92a2d 
  ambari-server/src/test/python/stacks/2.0.6/configs/default.json 5e3bad0 
  ambari-server/src/test/python/stacks/2.0.6/configs/secured.json d65b0ee 
  ambari-server/src/test/python/stacks/2.2/configs/default.json ea474e8 
  ambari-server/src/test/python/stacks/2.2/configs/secured.json 20678fa 

Diff: https://reviews.apache.org/r/26965/diff/


Testing
-------

Ran ambari-server unit tests:
----------------------------------------------------------------------
Total run:667
Total errors:0
Total failures:0
OK

And verified on a cluster using the following steps.

1. Set the properties:
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_source                         "/usr/hdp/current/hive-client/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_destination_folder             "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_source                          "/usr/hdp/current/pig-client/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_destination_folder              "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_source                        "/usr/hdp/current/sqoop-client/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_destination_folder            "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_source             "/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.jar                  "/usr/hdp/current/hive-webhcat/share/webhcat/svr/lib/hive-webhcat-*.jar"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.pig.archive          "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.hive.archive         "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.sqoop.archive        "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
/var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.streaming.jar        "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
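
Note that configs.sh stores these template strings verbatim; the {{ ... }} tokens are only resolved on the agent when the component starts, which is what steps 5-7 below verify.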

2. Verify that the properties were saved:
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=cluster-env
http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=webhcat-site
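
To script this check instead of opening the URLs in a browser, something along these lines works against the same REST endpoints. It is only a convenience sketch: it assumes the requests library is available and the default admin/admin credentials.

# Hypothetical helper for step 2: list the stored configuration tags
# for the two config types touched above.
import requests

BASE = "http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations"

for config_type in ("cluster-env", "webhcat-site"):
    resp = requests.get(BASE, params={"type": config_type},
                        auth=("admin", "admin"))  # default credentials; adjust as needed
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        print("%s  tag=%s" % (config_type, item.get("tag")))

Requesting a specific type together with its tag should return the property values themselves, which at this point should still show the {{ ... }} templates saved in step 1.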

3. Copy the changed files into place:
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/script/config_dictionary.py                   /usr/lib/ambari-server/lib/resource_management/libraries/script/config_dictionary.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/version.py                          /usr/lib/ambari-server/lib/resource_management/libraries/functions/version.py
yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py  /usr/lib/ambari-server/lib/resource_management/libraries/functions/dynamic_variable_interpretation.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py                /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py                 /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py
yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py                    /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/TEZ/package/scripts/params.py

4. Check that the tarballs are not already in HDFS; if they are, delete them:
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -rm -r /hdp/apps/2.2.0.0-974/

5. Before starting WebHCat, check webhcat-site.xml for the templeton.* properties, which should still be unversioned at this point:
less /etc/hive-webhcat/conf/webhcat-site.xml
Within less, search for the properties with:
/ templeton.*archive
/ templeton.*jar

6. Restart WebHCat and verify that the files are copied to HDFS:
python /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat_server.py START /var/lib/ambari-agent/data/command-102.json /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS /var/lib/ambari-agent/data/output-102.txt DEBUG /var/lib/ambari-agent/data/tmp
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
[hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
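
Each of the four listings should now show the corresponding tarball or jar, named with the 2.2.0.0-974 stack version exactly as the step 1 templates dictate.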

7. Verify that webhcat-site.xml now has properties with actual values.

Check the /etc/hive-webhcat/conf/webhcat-site.xml file again; this time the templeton.* properties should contain the fully versioned paths.


Thanks,

Alejandro Fernandez