Posted to commits@hive.apache.org by th...@apache.org on 2014/05/02 22:02:01 UTC

svn commit: r1592025 - in /hive/trunk/hcatalog: src/test/e2e/templeton/ src/test/e2e/templeton/deployers/ src/test/e2e/templeton/deployers/config/ src/test/e2e/templeton/deployers/config/hive/ src/test/e2e/templeton/deployers/config/webhcat/ webhcat/sv...

Author: thejas
Date: Fri May  2 20:02:01 2014
New Revision: 1592025

URL: http://svn.apache.org/r1592025
Log:
HIVE-6946 : Make it easier to run WebHCat e2e tests (Eugene Koifman via Thejas Nair)

Added:
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/README.txt
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/clean_file_system.sh
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/deploy_e2e_artifacts.sh
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/env.sh
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/restart_hive_redeploy_artifacts.sh
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/start_hive_services.sh
    hive/trunk/hcatalog/src/test/e2e/templeton/deployers/stop_hive_services.sh
Modified:
    hive/trunk/hcatalog/src/test/e2e/templeton/README.txt
    hive/trunk/hcatalog/src/test/e2e/templeton/build.xml
    hive/trunk/hcatalog/webhcat/svr/pom.xml
    hive/trunk/hcatalog/webhcat/svr/src/main/bin/webhcat_server.sh

Modified: hive/trunk/hcatalog/src/test/e2e/templeton/README.txt
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/README.txt?rev=1592025&r1=1592024&r2=1592025&view=diff
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/README.txt (original)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/README.txt Fri May  2 20:02:01 2014
@@ -19,11 +19,13 @@ End to end tests
 ---------------
 End to end tests in templeton run tests against an existing templeton server.
 It runs hcat, mapreduce, streaming, hive and pig tests.
+This requires a Hadoop cluster and a Hive metastore to be running.
 
 It's a good idea to look at current versions of
-http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html and
-http://hive.apache.org/docs/hcat_r0.5.0/configuration.html before proceeding.
+https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat and 
+https://cwiki.apache.org/confluence/display/Hive/WebHCat+Configure
 
+See deployers/README.txt for help automating some of the steps in this document.
 
 (Note that by default, webhcat-default.xml templeton.hive.properties sets
 hive.metastore.uris=thrift://localhost:9933, thus WebHCat will expect
@@ -39,6 +41,11 @@ to control which DB the metastore uses p
  <description>Controls which DB engine the metastore will use for persistence. In particular,
  where Derby will create its data files.</description>
 </property>
+<property>
+  <name>hive.metastore.uris</name>
+  <value>thrift://localhost:9933</value>
+  <description>For Hive CLI to connect to</description>
+</property>
 
 in hive-site.xml
 )
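+
+You can verify that the Hive CLI reaches this standalone metastore with, e.g.,
+"hive -e 'show databases;'" (a quick sanity check; not required by the tests).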
@@ -91,13 +98,36 @@ Tips:
 be obtained from Pig and the other two are obtained from your Hadoop distribution.
 For Hadoop 1.x you would need to upload hadoop-examples.jar to HDFS twice, once as hclient.jar and once as hexamples.jar.
 For Hadoop 2.x you would need to upload hadoop-mapreduce-client-jobclient.jar to HDFS as hclient.jar and hadoop-mapreduce-examples.jar to HDFS as hexamples.jar. 
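+For example, for Hadoop 2.x (a sketch; jar locations and version suffixes vary by distribution,
+see also deployers/deploy_e2e_artifacts.sh):
+  hadoop fs -put $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-<version>.jar webhcate2e/hclient.jar
+  hadoop fs -put $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-<version>.jar webhcate2e/hexamples.jar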
-Also see http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html#Hadoop+Distributed+Cache for notes on
-additional JAR files to copy to HDFS.
+Also see https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat#WebHCatInstallWebHCat-HadoopDistributedCache
+ for notes on additional JAR files to copy to HDFS.
 
 5. Make sure the TEMPLETON_HOME environment variable is set
 
 6. hadoop/conf/core-site.xml should have items described in
-http://hive.apache.org/docs/hcat_r0.5.0/rest_server_install.html#Permissions
+https://cwiki.apache.org/confluence/display/Hive/WebHCat+InstallWebHCat#WebHCatInstallWebHCat-Permissions
+
+7. Currently the Pig tar file available on http://pig.apache.org/ contains jar files compiled to work with Hadoop 1.x.
+To run WebHCat tests on Hadoop 2.x you need to build your own Pig tar for Hadoop 2. To do that, download the
+Pig source distribution and build it with "ant -Dforrest.home=$FORREST_HOME -Dhadoopversion=23 clean tar".
+You may also need to adjust the following properties in Pig's build.xml:
+<property name="pig.version" value="0.12.1" />
+<property name="pig.version.suffix" value="-SNAPSHOT" />
+
+8. Enable webhdfs by adding the following to your hadoop hdfs-site.xml :
+<property>
+  <name>dfs.webhdfs.enabled</name>
+  <value>true</value>
+</property>
+<property>
+  <name>dfs.http.address</name>
+  <value>127.0.0.1:8085</value>
+  <final>true</final>
+</property>
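+
+You can sanity-check that webhdfs is up with, e.g. (the address must match dfs.http.address above):
+  curl "http://127.0.0.1:8085/webhdfs/v1/?op=LISTSTATUS"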
+
+****
+**** See deployers/ for scripts that automate a lot of the set up.
+****
+
 
 Running the tests
 -----------------
@@ -174,25 +204,6 @@ and the folder hdfs://hostname:8020/sqoo
 
 Notes
 -----
-
-
-
-Enable webhdfs by adding the following to your hadoop hdfs-site.xml :
-
-<property>
-  <name>dfs.webhdfs.enabled</name>
-  <value>true</value>
-</property>
-<property>
-  <name>dfs.http.address</name>
-  <value>127.0.0.1:8085</value>
-  <final>true</final>
-</property>
-
-You can build a server that will measure test coverage by using templeton:
-ant clean; ant e2e
-This assumes you've got webhdfs at the address above, the inpdir info in /user/templeton, and templeton running on the default port.  You can change any of those properties in the build file.
-
 It's best to set HADOOP_HOME_WARN_SUPPRESS=true everywhere you can.
 It is also useful to add the following to conf/hadoop-env.sh:
 export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

Modified: hive/trunk/hcatalog/src/test/e2e/templeton/build.xml
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/build.xml?rev=1592025&r1=1592024&r2=1592025&view=diff
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/build.xml (original)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/build.xml Fri May  2 20:02:01 2014
@@ -87,8 +87,8 @@
              if group=3, then 3 .conf files will be processed in parallel
              if conf.file=2 there will be 2 thread per .conf file, each thread 
              executing a single group (identified by 'name' element) -->
-        <property name="fork.factor.group" value="5"/>
-        <property name="fork.factor.conf.file" value="5"/>
+        <property name="fork.factor.group" value="3"/>
+        <property name="fork.factor.conf.file" value="3"/>
         <property name="e2e.debug" value="false"/>
         <property name="tests.to.run" value=""/>
 

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/README.txt
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/README.txt?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/README.txt (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/README.txt Fri May  2 20:02:01 2014
@@ -0,0 +1,55 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+Overview
+This directory contains a set of scripts that make running WebHCat e2e tests easier.  These scripts
+help ensure that all the necessary artifacts for e2e tests are deployed to the cluster and speed up
+the code-compile-test loop in Hive/WebHCat.
+
+
+Assumptions
+It is assumed that you have a properly set up Hadoop2 cluster running.
+
+High level workflow
+1. Build Hive (e.g. mvn clean package -Phadoop-2,dist -DskipTests)
+2. Define variables in env.sh.  This should be the only file you must change.
+3. Run restart_hive_redeploy_artifacts.sh, which will 
+    a. Stop Hive Metastore, WebHCat server
+    b. Delete dirs in HDFS which may be left over from previous runs.  These scripts are currently
+      used with a cluster dedicated to WebHCat e2e tests; if your cluster is used for anything
+      else, check what this step deletes before running it.
+    c. Copy hive-site.xml and webhcat-site.xml under HIVE_HOME with minimal config needed to start
+      the services and run e2e tests.
+    d. Start Hive Metastore and WebHCat servers
+    e. Copy various artifacts to HDFS as explained in e2e/templeton/README.txt.
+4. Now you can run the test command as explained in e2e/templeton/README.txt.
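+
+For example, a typical session looks like this (a sketch; env.sh values are site-specific):
+   cd hcatalog/src/test/e2e/templeton/deployers
+   vi env.sh        # point HADOOP_HOME, PROJ_HOME, versions, etc. at your installs
+   ./restart_hive_redeploy_artifacts.sh
+   # then run the tests as described in e2e/templeton/README.txt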
+
+
+If you would like to make this better (in no particular order):
+1. env.sh is sourced from all other scripts but this only works if the 'other' script is called
+   from deployers/.
+2. send 'derby.log' somewhere in /tmp/
+3. some tests (e.g. Sqoop) require an RDBMS set up with resources pre-created.  See if this can
+   be automated (at least truncating the table between runs).
+4. Make the same work on Windows (w/o making a copy of each .sh file if at all possible)
+5. Configuring a working (even pseudo-dist) Hadoop-2 cluster takes some knowledge.  It may be worth
+   scripting the steps: take the Hadoop binary tar file, explode it and copy a few pre-canned config
+   files (mapred-site, yarn-site, etc.) into it, so that anyone can easily set up a test env.
+6. Make this set of scripts work with Hadoop-1 (should not take much effort, if any).

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/clean_file_system.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/clean_file_system.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/clean_file_system.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/clean_file_system.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,33 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+
+
+# This script deletes things from DFS that may have been left over from previous runs of e2e
+# tests to make sure the next run starts with a clean slate.
+
+. ./env.sh
+
+echo "Deleting artifacts from HDFS..."
+
+${HADOOP_HOME}/bin/hdfs dfs -rm -r       /user/hive/ /user/${USER}/ /user/templeton /apps /tmp /sqoopoutputdir
+${HADOOP_HOME}/bin/hdfs dfs -mkdir -p    /tmp/hadoop-${USER} /user/hive/warehouse /user/${USER}/ /user/templeton /apps/templeton /tmp/hadoop-yarn /tmp/templeton_test_out
+${HADOOP_HOME}/bin/hdfs dfs -chmod -R a+rwx /user /tmp/
+${HADOOP_HOME}/bin/hdfs dfs -chmod g+rwx   /user/hive/warehouse

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml Fri May  2 20:02:01 2014
@@ -0,0 +1,56 @@
+<?xml version="1.0"?>
+<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
+<!--
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+-->
+
+<configuration>
+
+    <!-- Hive Configuration can either be stored in this file or in the hadoop configuration files  -->
+    <!-- that are implied by Hadoop setup variables.                                                -->
+    <!-- Aside from Hadoop setup variables - this file is provided as a convenience so that Hive    -->
+    <!-- users do not have to edit hadoop configuration files (that may be managed as a centralized -->
+    <!-- resource).                                                                                 -->
+
+    <!-- Hive Execution Parameters -->
+    <property>
+        <name>javax.jdo.option.ConnectionURL</name>
+        <value>jdbc:sqlserver://172.16.65.152:1433;databaseName=master</value>
+    </property>
+    <property>
+        <name>javax.jdo.option.ConnectionUserName</name>
+        <value>hive</value>
+    </property>
+    <property>
+        <name>javax.jdo.option.ConnectionPassword</name>
+        <value>hive</value>
+    </property>
+    <property>
+        <name>javax.jdo.option.ConnectionDriverName</name>
+        <value>com.microsoft.sqlserver.jdbc.SQLServerDriver</value>
+    </property>
+    <property>
+        <name>datanucleus.autoCreateSchema</name>
+        <value>false</value>
+    </property>
+    <!--
+    can be used by DataNucleus (have not tried)
+    javax.jdo.mapping.Catalog={the_catalog_name}
+    javax.jdo.mapping.Schema={the_schema_name}
+    -->
+</configuration>

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml Fri May  2 20:02:01 2014
@@ -0,0 +1,87 @@
+<?xml version="1.0"?>
+<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
+<!--
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+-->
+
+<configuration>
+    <property>
+        <name>javax.jdo.option.ConnectionURL</name>
+        <!--databaseName should match $METASTORE_DB in ../../env.sh-->
+        <value>jdbc:derby:;databaseName=/tmp/webhcat_e2e/logs/webhcat_e2e_metastore_db;create=true</value>
+    </property>
+
+    <property>
+        <name>hive.metastore.uris</name>
+        <value>thrift://localhost:9933</value>
+        <description>For Hive CLI to connect to</description>
+    </property>
+    <!--
+    This enables client side Hive (RDBMS style) security
+    <property>
+        <name>hive.security.authorization.enabled</name>
+        <value>true</value>
+        <description>enable or disable the hive client authorization</description>
+    </property>
+    <property>
+        <name>hive.security.authorization.createtable.owner.grants</name>
+        <value>ALL</value>
+        <description>the privileges automatically granted to the owner whenever a table gets created.
+            An example like "select,drop" will grant select and drop privilege to the owner of the table
+        </description>
+    </property>
+    -->
+    <!--
+    enable file based auth for Hive on metastore side, i.e. enforce metadata 
+    security as if it were stored together with data
+    https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Authorization
+    <property>
+        <name>hive.metastore.execute.setugi</name>
+        <value>true</value>
+        <description>Make metastore execute actions as doAs
+            (rather than the UID which owns metastore process)
+        </description>
+    </property>
+    <property>
+        <name>hive.security.metastore.authorization.manager</name>
+        <value>org.apache.hadoop.hive.ql.security.authorization.StorageBasedAuthorizationProvider</value>
+        <description>the hive client authorization manager class name.
+            The user defined authorization class should implement interface
+            org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.
+        </description>
+    </property>
+    <property>
+        <name>hive.security.metastore.authenticator.manager</name>
+        <value>org.apache.hadoop.hive.ql.security.HadoopDefaultMetastoreAuthenticator</value>
+        <description>authenticator manager class name to be used in the metastore for authentication.
+            The user defined authenticator should implement interface
+            org.apache.hadoop.hive.ql.security.HiveAuthenticationProvider.
+        </description>
+    </property>
+    <property>
+        <name>hive.metastore.pre.event.listeners</name>
+        <value>org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener</value>
+        <description>pre-event listener classes to be loaded on the metastore side to run code
+            whenever databases, tables, and partitions are created, altered, or dropped.
+            Set to org.apache.hadoop.hive.ql.security.authorization.AuthorizationPreEventListener
+            if metastore-side authorization is desired.
+        </description>
+    </property>
+    -->
+       
+</configuration>

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml Fri May  2 20:02:01 2014
@@ -0,0 +1,134 @@
+<?xml version="1.0" encoding="UTF-8"?>
+
+<!--
+    Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+-->
+
+<!-- The default settings for Templeton. -->
+<!-- Edit templeton-site.xml to change settings for your local -->
+<!-- install. -->
+
+<configuration>
+<!--TODO:
+1. make pig/hive versions env variables-->
+
+  <property>
+    <name>templeton.hcat</name>
+    <value>${env.HCAT_PREFIX}/bin/hcat</value>
+    <description>The path to the hcatalog executable.</description>
+  </property>
+
+    <property>
+        <name>templeton.libjars</name>
+        <value>${env.TEMPLETON_HOME}/../lib/zookeeper-3.4.5.jar</value>
+        <description>Jars to add to the classpath.</description>
+    </property>
+
+    <property>
+        <name>templeton.pig.archive</name>
+        <value>hdfs:///apps/templeton/pig-${env.PIG_VERSION}.tar.gz</value>
+        <description>The path to the Pig archive.</description>
+    </property>
+    <property>
+        <name>templeton.pig.path</name>
+        <value>pig-${env.PIG_VERSION}.tar.gz/pig-${env.PIG_VERSION}/bin/pig</value>
+        <description>The path to the Pig executable.</description>
+    </property>
+    <property>
+        <name>templeton.hive.archive</name>
+        <value>hdfs:///apps/templeton/apache-hive-${env.HIVE_VERSION}-bin.tar.gz</value>
+        <description>The path to the Hive archive.</description>
+    </property>
+
+    <property>
+        <name>templeton.hive.path</name>
+        <value>apache-hive-${env.HIVE_VERSION}-bin.tar.gz/apache-hive-${env.HIVE_VERSION}-bin/bin/hive</value>
+        <description>The path to the Hive executable.</description>
+    </property>
+
+    <property>
+        <name>templeton.hive.home</name>
+        <value>apache-hive-${env.HIVE_VERSION}-bin.tar.gz/apache-hive-${env.HIVE_VERSION}-bin</value>
+        <description>The path to the Hive home within the tar.  This is needed if 
+            Hive is not installed on all nodes in the cluster and needs to be 
+            shipped to the target node in the cluster to execute a Pig job which uses
+            HCat, a Hive query, etc.</description>
+    </property>
+    <property>
+        <name>templeton.hcat.home</name>
+        <value>apache-hive-${env.HIVE_VERSION}-bin.tar.gz/apache-hive-${env.HIVE_VERSION}-bin/hcatalog</value>
+        <description>The path to the HCat home within the tar.  This is needed if
+            Hive is not installed on all nodes in the cluster and needs to be
+            shipped to the target node in the cluster to execute a Pig job which uses
+            HCat, a Hive query, etc.</description>
+    </property>
+
+    <property>
+        <name>templeton.controller.mr.child.opts</name>
+        <value> -Xmx64m -Djava.net.preferIPv4Stack=true</value>
+    </property>
+
+    <property>
+        <!--\,thrift://127.0.0.1:9933-->
+        <name>templeton.hive.properties</name>
+        <value>hive.metastore.uris=thrift://localhost:9933,hive.metastore.sasl.enabled=false</value>
+    </property>
+
+<!--
+    <property>
+        <name>webhcat.proxyuser.hue.hosts</name>
+        <value>localhost</value>
+        <description>
+            List of hosts the '#USER#' user is allowed to perform 'doAs'
+            operations.
+
+            The '#USER#' must be replaced with the username of the user who is
+            allowed to perform 'doAs' operations.
+
+            The value can be the '*' wildcard, which means every host is allowed,
+            or a comma-separated list of hostnames.
+
+            If value is a blank string or webhcat.proxyuser.#USER#.hosts is missing,
+            no hosts will be allowed.
+
+            For multiple users copy this property and replace the user name
+            in the property name.
+        </description>
+    </property>
+    <property>
+        <name>webhcat.proxyuser.hue.groups</name>
+        <value>staff</value>
+        <description>
+            List of groups the '#USER#' user is allowed to impersonate users
+            from to perform 'doAs' operations.
+
+            The '#USER#' must be replaced with the username of the user who is
+            allowed to perform 'doAs' operations.
+
+            The value can be the '*' wildcard, which means any doAs value is
+            allowed, or a comma-separated list of groups.
+
+            If value is an empty list or webhcat.proxyuser.#USER#.groups is missing,
+            every doAs call value will fail.
+
+            For multiple users copy this property and replace the user name
+            in the property name.
+        </description>
+    </property>
+    -->
+</configuration>

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/deploy_e2e_artifacts.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/deploy_e2e_artifacts.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/deploy_e2e_artifacts.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/deploy_e2e_artifacts.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,53 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+
+#This script copies files needed by e2e tests to DFS
+
+source ./env.sh
+
+echo "Deploying artifacts to HDFS..."
+
+${HADOOP_HOME}/bin/hdfs dfs -put ${PROJ_HOME}/hcatalog/src/test/e2e/templeton/inpdir/ webhcate2e
+#For hadoop1 we copy the same file with 2 names
+#$HADOOP_HOME/bin/hadoop fs -put hadoop-examples-1.2.1.jar  webhcate2e/hexamples.jar
+#$HADOOP_HOME/bin/hadoop fs -put hadoop-examples-1.2.1.jar  webhcate2e/hclient.jar
+
+#For hadoop2 there are 2 separate jars
+${HADOOP_HOME}/bin/hdfs dfs -put ${HADOOP_HOME}/share/hadoop/mapreduce/hadoop-mapreduce-examples-${HADOOP_VERSION}.jar  webhcate2e/hexamples.jar
+${HADOOP_HOME}/bin/hdfs dfs -put ${HADOOP_HOME}/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-${HADOOP_VERSION}.jar webhcate2e/hclient.jar
+${HADOOP_HOME}/bin/hdfs dfs -put ${HADOOP_HOME}/share/hadoop/tools/lib/hadoop-streaming-${HADOOP_VERSION}.jar  /user/templeton/hadoop-streaming.jar
+
+
+#must match config/webhcat/webhcat-site.xml
+${HADOOP_HOME}/bin/hdfs dfs -put ${PROJ_HOME}/packaging/target/apache-hive-${HIVE_VERSION}-bin.tar.gz /apps/templeton/apache-hive-${HIVE_VERSION}-bin.tar.gz
+# To run against Hadoop2 cluster, you have to build Pig tar yourself with 
+# "ant -Dforrest.home=$FORREST_HOME -Dhadoopversion=23 clean tar"
+${HADOOP_HOME}/bin/hadoop fs -put ${PIG_TAR_PATH}/pig-${PIG_VERSION}.tar.gz /apps/templeton/pig-${PIG_VERSION}.tar.gz
+${HADOOP_HOME}/bin/hadoop fs -put ${PIG_PIGGYBANK_PATH}  webhcate2e/
+#standard Pig distro from ASF for Hadoop 1
+# ${HADOOP_HOME}/bin/hadoop fs -put /Users/ekoifman/dev/data/jarsForTmplte2e/pig-0.12.0.tar.gz /apps/templeton/pig-0.12.0.tar.gz
+#${HADOOP_HOME}/bin/hadoop fs -put /Users/ekoifman/dev/data/jarsForTmplte2e/pig-0.12.0/contrib/piggybank/java/piggybank.jar  webhcate2e/
+
+
+${HADOOP_HOME}/bin/hadoop fs -put ${HIVE_HOME}/lib/zookeeper-3.4.5.jar /apps/templeton/zookeeper-3.4.5.jar
+
+#check what got deployed
+${HADOOP_HOME}/bin/hdfs dfs -ls /apps/templeton webhcate2e /user/templeton /user/hive/warehouse

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/env.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/env.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/env.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/env.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+#set -x;
+
+# define necessary env vars here and source it in other files
+
+export HADOOP_VERSION=2.4.1-SNAPSHOT
+export HIVE_VERSION=0.14.0-SNAPSHOT
+export PIG_VERSION=0.12.0
+
+#Root of project source tree
+export PROJ_HOME=/Users/${USER}/dev/hive
+export HIVE_HOME=${PROJ_HOME}/packaging/target/apache-hive-${HIVE_VERSION}-bin/apache-hive-${HIVE_VERSION}-bin
+export HADOOP_HOME=/Users/${USER}/dev/hwxhadoop/hadoop-dist/target/hadoop-${HADOOP_VERSION}
+#export SQOOP_HOME=/
+
+#Make sure Pig is built for the Hadoop version you are running
+export PIG_TAR_PATH=/Users/${USER}/dev/pig-0.12.0-src/build
+#this is part of Pig distribution
+export PIG_PIGGYBANK_PATH=/Users/${USER}/dev/pig-0.12.0-src/build/tar/pig-0.12.0/contrib/piggybank/java/piggybank.jar
+
+export WEBHCAT_LOG_DIR=/tmp/webhcat_e2e/logs
+export WEBHCAT_PID_DIR=${WEBHCAT_LOG_DIR}
+#config/hive/hive-site.xml should match this path - it doesn't understand env vars
export METASTORE_DB=${WEBHCAT_LOG_DIR}/webhcat_e2e_metastore_db

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/restart_hive_redeploy_artifacts.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/restart_hive_redeploy_artifacts.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/restart_hive_redeploy_artifacts.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/restart_hive_redeploy_artifacts.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,27 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+
+#This is a convenience script that runs the complete lifecycle; useful for developers.
+
+./stop_hive_services.sh
+./clean_file_system.sh;
+./deploy_e2e_artifacts.sh;
+./start_hive_services.sh;

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/start_hive_services.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/start_hive_services.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/start_hive_services.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/start_hive_services.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,43 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+#This script copies precanned *-site.xml files needed to start Hive and WebHCat services, then
+#starts the services
+
+
+source ./env.sh
+
+#decide which DB to run against
+cp ${PROJ_HOME}/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.xml ${HIVE_HOME}/conf/hive-site.xml
+#cp ${PROJ_HOME}/hcatalog/src/test/e2e/templeton/deployers/config/hive/hive-site.mssql.xml ${HIVE_HOME}/conf/hive-site.xml
+
+cp ${PROJ_HOME}/hcatalog/src/test/e2e/templeton/deployers/config/webhcat/webhcat-site.xml ${HIVE_HOME}/hcatalog/etc/webhcat/webhcat-site.xml
+
+if [ -d ${WEBHCAT_LOG_DIR} ]; then
+  rm -Rf ${WEBHCAT_LOG_DIR};
+fi
+mkdir -p ${WEBHCAT_LOG_DIR};
+echo "Starting Metastore..."
+nohup ${HIVE_HOME}/bin/hive --service metastore -p9933 >>${WEBHCAT_LOG_DIR}/metastore_console.log 2>>${WEBHCAT_LOG_DIR}/metastore_error.log &
+echo $! > ${WEBHCAT_LOG_DIR}/metastore.pid
+echo "Starting WebHCat..."
+${HIVE_HOME}/hcatalog/sbin/webhcat_server.sh start
+
+jps;
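+
+# Quick sanity checks (illustrative): the metastore Thrift port and the WebHCat REST endpoint
+# netstat -an | grep 9933
+# curl http://localhost:50111/templeton/v1/status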

Added: hive/trunk/hcatalog/src/test/e2e/templeton/deployers/stop_hive_services.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/src/test/e2e/templeton/deployers/stop_hive_services.sh?rev=1592025&view=auto
==============================================================================
--- hive/trunk/hcatalog/src/test/e2e/templeton/deployers/stop_hive_services.sh (added)
+++ hive/trunk/hcatalog/src/test/e2e/templeton/deployers/stop_hive_services.sh Fri May  2 20:02:01 2014
@@ -0,0 +1,32 @@
+#!/usr/bin/env bash
+#
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+#
+
+#This script stops the Hive and WebHCat services started by start_hive_services.sh
+ 
+ 
+source ./env.sh
+
+echo "Stopping Metastore...";
+kill `cat ${WEBHCAT_LOG_DIR}/metastore.pid`;
+echo "Stopping WebHCat...";
+${HIVE_HOME}/hcatalog/sbin/webhcat_server.sh stop;
+
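+#remove the precanned configs that start_hive_services.sh copied in, leaving HIVE_HOME clean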
+rm ${HIVE_HOME}/conf/hive-site.xml
+rm ${HIVE_HOME}/hcatalog/etc/webhcat/webhcat-site.xml

Modified: hive/trunk/hcatalog/webhcat/svr/pom.xml
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/webhcat/svr/pom.xml?rev=1592025&r1=1592024&r2=1592025&view=diff
==============================================================================
--- hive/trunk/hcatalog/webhcat/svr/pom.xml (original)
+++ hive/trunk/hcatalog/webhcat/svr/pom.xml Fri May  2 20:02:01 2014
@@ -141,6 +141,12 @@
   </profiles>
 
   <build>
+      <resources>
+          <resource>
+              <targetPath>.</targetPath>
+              <directory>src/main/config</directory>
+          </resource>
+      </resources>
     <plugins>
       <plugin>
         <groupId>org.apache.maven.plugins</groupId>

Modified: hive/trunk/hcatalog/webhcat/svr/src/main/bin/webhcat_server.sh
URL: http://svn.apache.org/viewvc/hive/trunk/hcatalog/webhcat/svr/src/main/bin/webhcat_server.sh?rev=1592025&r1=1592024&r2=1592025&view=diff
==============================================================================
--- hive/trunk/hcatalog/webhcat/svr/src/main/bin/webhcat_server.sh (original)
+++ hive/trunk/hcatalog/webhcat/svr/src/main/bin/webhcat_server.sh Fri May  2 20:02:01 2014
@@ -38,8 +38,9 @@ function real_script_name() {
 }
 
 function usage() {
-        echo "usage: $0 [start|stop|foreground]"
+        echo "usage: $0 [start|startDebug|stop|foreground]"
         echo "  start           Start the Webhcat Server"
+        echo "  startDebug      Start the Webhcat Server listening for debugger on port 5005"
         echo "  stop            Stop the Webhcat Server"
         echo "  foreground      Run the Webhcat Server in the foreground"
         exit 1
@@ -225,6 +226,10 @@ case $cmd in
         start)
                 start_webhcat
                 ;;
+        startDebug)
+                export HADOOP_OPTS="${HADOOP_OPTS} -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
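+                # jdwp: the JVM listens for a debugger on port 5005; suspend=n means
+                # the server starts without waiting for a debugger to attach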
+                start_webhcat
+                ;;
         stop)
                 stop_webhcat
                 ;;