Posted to mapreduce-commits@hadoop.apache.org by to...@apache.org on 2010/02/12 06:16:33 UTC

svn commit: r909237 - in /hadoop/mapreduce/trunk: ./ src/contrib/sqoop/doc/ src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/ src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/ src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/ src/contri...

Author: tomwhite
Date: Fri Feb 12 05:16:32 2010
New Revision: 909237

URL: http://svn.apache.org/viewvc?rev=909237&view=rev
Log:
MAPREDUCE-1341. Sqoop should have an option to create hive tables and skip the table import step. Contributed by Leonid Furman.

Added:
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOnlyImport.q
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOverwriteImport.q
Modified:
    hadoop/mapreduce/trunk/CHANGES.txt
    hadoop/mapreduce/trunk/src/contrib/sqoop/doc/Sqoop-manpage.txt
    hadoop/mapreduce/trunk/src/contrib/sqoop/doc/hive.txt
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/Sqoop.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/SqoopOptions.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/HiveImport.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestHiveImport.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestTableDefWriter.java
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/customDelimImport.q
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/dateImport.q
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/failingImport.q
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/normalImport.q
    hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/numericImport.q

Modified: hadoop/mapreduce/trunk/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/CHANGES.txt?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/CHANGES.txt (original)
+++ hadoop/mapreduce/trunk/CHANGES.txt Fri Feb 12 05:16:32 2010
@@ -54,6 +54,9 @@
 
     MAPREDUCE-1433. Add a delegation token for MapReduce. (omalley)
 
+    MAPREDUCE-1341. Sqoop should have an option to create hive tables and
+    skip the table import step. (Leonid Furman via tomwhite)
+
   IMPROVEMENTS
 
     MAPREDUCE-1198. Alternatively schedule different types of tasks in

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/doc/Sqoop-manpage.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/doc/Sqoop-manpage.txt?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/doc/Sqoop-manpage.txt (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/doc/Sqoop-manpage.txt Fri Feb 12 05:16:32 2010
@@ -98,6 +98,13 @@
 --hive-import::
   If set, then import the table into Hive
 
+--hive-create-only::
+  Creates the table in Hive and skips the data import step
+
+--hive-overwrite::
+  Overwrites the existing table in Hive.
+  By default, an existing table is not overwritten.
+
 --table (table-name)::
   The table to import
 

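For quick reference, here is a minimal sketch of how the new flags are meant to be combined; the JDBC URL and table name are invented placeholders, and printing the argument vectors merely stands in for handing them to Sqoop:

    import java.util.Arrays;

    // FlagUsageSketch: illustrates the new flag combinations only.
    // The connect string and table name are hypothetical.
    public class FlagUsageSketch {
      public static void main(String[] args) {
        // Create the Hive table definition, but skip both the data
        // import and the LOAD DATA INPATH step:
        String[] createOnly = {
          "--connect", "jdbc:mysql://db.example.com/corp",
          "--table", "EMPLOYEES",
          "--hive-import", "--hive-create-only"
        };
        // Also drop the IF NOT EXISTS guard, so the generated
        // CREATE TABLE statement runs unconditionally:
        String[] createOverwrite = {
          "--connect", "jdbc:mysql://db.example.com/corp",
          "--table", "EMPLOYEES",
          "--hive-import", "--hive-create-only", "--hive-overwrite"
        };
        System.out.println(Arrays.toString(createOnly));
        System.out.println(Arrays.toString(createOverwrite));
      }
    }
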
Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/doc/hive.txt
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/doc/hive.txt?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/doc/hive.txt (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/doc/hive.txt Fri Feb 12 05:16:32 2010
@@ -27,14 +27,18 @@
 into Hive is as simple as adding the *+--hive-import+* option to your
 Sqoop command line.
 
-After your data is imported into HDFS, Sqoop will generate a Hive
-script containing a +CREATE TABLE+ operation defining your columns using
-Hive's types, and a +LOAD DATA INPATH+ statement to move the data files
-into Hive's warehouse directory. The script will be executed by
-calling the installed copy of hive on the machine where Sqoop is run.
-If you have multiple Hive installations, or +hive+ is not in your
-+$PATH+ use the *+--hive-home+* option to identify the Hive installation
-directory. Sqoop will use +$HIVE_HOME/bin/hive+ from here.
+By default the data is imported into HDFS, but you can skip this step
+by using the *+--hive-create-only+* option. Optionally, you can specify
+the *+--hive-overwrite+* option to indicate that the existing table in
+Hive must be replaced. After your data is imported into HDFS (or this
+step is omitted), Sqoop will generate a Hive script containing a
++CREATE TABLE+ operation defining your columns using Hive's types and,
+unless *+--hive-create-only+* is given, a +LOAD DATA INPATH+ statement
+to move the data files into Hive's warehouse directory. The script will
+be executed by calling the installed copy of hive on the machine where
+Sqoop is run. If you have multiple Hive installations, or +hive+ is not
+in your +$PATH+, use the *+--hive-home+* option to identify the Hive
+installation directory. Sqoop will use +$HIVE_HOME/bin/hive+ from here.
 
 NOTE: This function is incompatible with +--as-sequencefile+.
 

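To make the three modes concrete, this sketch assembles the script text the same way the HiveImport and TableDefWriter changes below do; the table, columns, and warehouse path are hypothetical, not output of this commit:

    // GeneratedScriptSketch: approximates the Hive script Sqoop emits
    // under each flag combination. All names are placeholders.
    public class GeneratedScriptSketch {
      static String script(boolean createOnly, boolean overwrite) {
        String create = (overwrite
            ? "CREATE TABLE EMPLOYEES"
            : "CREATE TABLE IF NOT EXISTS EMPLOYEES")
            + " ( ID INT, NAME STRING) STORED AS TEXTFILE;\n";
        String load = "LOAD DATA INPATH '/sqoop/warehouse/EMPLOYEES'"
            + " INTO TABLE EMPLOYEES;\n";
        // --hive-create-only suppresses the LOAD DATA statement;
        // the CREATE TABLE statement is always written.
        return createOnly ? create : create + load;
      }
      public static void main(String[] args) {
        System.out.print(script(false, false)); // plain --hive-import
        System.out.print(script(true, false));  // + --hive-create-only
        System.out.print(script(true, true));   // + --hive-overwrite
      }
    }
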
Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/Sqoop.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/Sqoop.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/Sqoop.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/Sqoop.java Fri Feb 12 05:16:32 2010
@@ -118,9 +118,12 @@
     jarFile = generateORM(tableName);
 
     if (options.getAction() == SqoopOptions.ControlAction.FullImport) {
-      // Proceed onward to do the import.
-      ImportJobContext context = new ImportJobContext(tableName, jarFile, options);
-      manager.importTable(context);
+      // check if data import is to be performed
+      if (!options.doCreateHiveTableOnly()) {
+        // Proceed onward to do the import.
+        ImportJobContext context = new ImportJobContext(tableName, jarFile, options);
+        manager.importTable(context);
+      }
 
       // If the user wants this table to be in Hive, perform that post-load.
       if (options.doHiveImport()) {

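The hunk cuts off just as the Hive step begins, so here is a condensed sketch of the resulting control flow; the booleans and println calls stand in for SqoopOptions, the connection manager, and HiveImport:

    // ControlFlowSketch: condensed view of the import path after this
    // change. Printlns stand in for the real import and Hive steps.
    public class ControlFlowSketch {
      static void run(boolean createHiveTableOnly, boolean hiveImport) {
        if (!createHiveTableOnly) {
          System.out.println("importTable(): copy the table into HDFS");
        }
        if (hiveImport) {
          System.out.println("HiveImport: generate and run the Hive script");
        }
      }
      public static void main(String[] args) {
        run(true, true); // --hive-create-only: only the Hive script runs
      }
    }
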
Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/SqoopOptions.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/SqoopOptions.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/SqoopOptions.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/SqoopOptions.java Fri Feb 12 05:16:32 2010
@@ -100,6 +100,8 @@
   private String tmpDir; // where temp data goes; usually /tmp
   private String hiveHome;
   private boolean hiveImport;
+  private boolean createHiveTableOnly;
+  private boolean overwriteHiveTable;
   private String hiveTableName;
   private String packageName; // package to prepend to auto-named classes.
   private String className; // package+class to apply to individual table import.
@@ -204,6 +206,8 @@
 
       this.direct = getBooleanProperty(props, "direct.import", this.direct);
       this.hiveImport = getBooleanProperty(props, "hive.import", this.hiveImport);
+      this.createHiveTableOnly = getBooleanProperty(props, "hive.create.table.only", this.createHiveTableOnly);
+      this.overwriteHiveTable = getBooleanProperty(props, "hive.overwrite.table", this.overwriteHiveTable);
       this.useCompression = getBooleanProperty(props, "compression", this.useCompression);
       this.directSplitSize = getLongProperty(props, "direct.split.size",
           this.directSplitSize);
@@ -513,6 +517,10 @@
           this.hiveHome = args[++i];
         } else if (args[i].equals("--hive-import")) {
           this.hiveImport = true;
+        } else if (args[i].equals("--hive-create-only")) {
+          this.createHiveTableOnly = true;
+        } else if (args[i].equals("--hive-overwrite")) {
+          this.overwriteHiveTable = true;
         } else if (args[i].equals("--hive-table")) {
           this.hiveTableName = args[++i];
         } else if (args[i].equals("--num-mappers") || args[i].equals("-m")) {
@@ -780,6 +788,20 @@
   }
 
   /**
+   * @return the user-specified option to create the table in Hive without loading data
+   */
+  public boolean doCreateHiveTableOnly() {
+    return createHiveTableOnly;
+  }
+
+  /**
+   * @return the user-specified option to overwrite the existing table in Hive
+   */
+  public boolean doOverwriteHiveTable() {
+    return overwriteHiveTable;
+  }
+
+  /**
    * @return location where .java files go; guaranteed to end with '/'
    */
   public String getCodeOutputDir() {

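The new flags also round-trip through the saved property file; this sketch assumes, consistent with the hunk above, that getBooleanProperty() reads "true"/"false" strings under the keys hive.create.table.only and hive.overwrite.table:

    import java.util.Properties;

    // PropertySketch: assumed persistence format of the two new flags.
    // Key names are taken from the hunk above.
    public class PropertySketch {
      public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("hive.create.table.only", "true");
        props.setProperty("hive.overwrite.table", "false");
        // Mirrors what getBooleanProperty() is assumed to do:
        boolean createOnly = Boolean.parseBoolean(
            props.getProperty("hive.create.table.only", "false"));
        boolean overwrite = Boolean.parseBoolean(
            props.getProperty("hive.overwrite.table", "false"));
        System.out.println("createHiveTableOnly=" + createOnly
            + ", overwriteHiveTable=" + overwrite);
      }
    }
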
Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/HiveImport.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/HiveImport.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/HiveImport.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/HiveImport.java Fri Feb 12 05:16:32 2010
@@ -150,7 +150,9 @@
         FileOutputStream fos = new FileOutputStream(tempFile);
         w = new BufferedWriter(new OutputStreamWriter(fos));
         w.write(createTableStr, 0, createTableStr.length());
-        w.write(loadDataStmtStr, 0, loadDataStmtStr.length());
+        if (!options.doCreateHiveTableOnly()) {
+          w.write(loadDataStmtStr, 0, loadDataStmtStr.length());
+        }
       } catch (IOException ioe) {
         LOG.error("Error writing Hive load-in script: " + ioe.toString());
         ioe.printStackTrace();

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/java/org/apache/hadoop/sqoop/hive/TableDefWriter.java Fri Feb 12 05:16:32 2010
@@ -121,7 +121,11 @@
 
     String [] colNames = getColumnNames();
     StringBuilder sb = new StringBuilder();
-    sb.append("CREATE TABLE " + outputTableName + " ( ");
+    if (options.doOverwriteHiveTable()) {
+      sb.append("CREATE TABLE " + outputTableName + " ( ");
+    } else {
+      sb.append("CREATE TABLE IF NOT EXISTS " + outputTableName + " ( ");
+    }
 
     boolean first = true;
     for (String col : colNames) {

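Isolated as a standalone method, the branch above reads as follows; note that --hive-overwrite simply drops the IF NOT EXISTS guard, leaving an unconditional CREATE TABLE in the generated script:

    // DdlPrefixSketch: the CREATE TABLE prefix choice from the
    // TableDefWriter hunk, isolated for illustration.
    public class DdlPrefixSketch {
      static String createTablePrefix(String table, boolean overwrite) {
        // Without --hive-overwrite, IF NOT EXISTS leaves any existing
        // table (and its data) untouched.
        return overwrite
            ? "CREATE TABLE " + table + " ( "
            : "CREATE TABLE IF NOT EXISTS " + table + " ( ";
      }
      public static void main(String[] args) {
        System.out.println(createTablePrefix("outputTable", false));
        System.out.println(createTablePrefix("outputTable", true));
      }
    }
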
Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestHiveImport.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestHiveImport.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestHiveImport.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestHiveImport.java Fri Feb 12 05:16:32 2010
@@ -109,6 +109,24 @@
     runImportTest("NORMAL_HIVE_IMPORT", types, vals, "normalImport.q", null);
   }
 
+  /** Test that the table is created in Hive with no data imported */
+  @Test
+  public void testCreateOnlyHiveImport() throws IOException {
+    String [] types = { "VARCHAR(32)", "INTEGER", "CHAR(64)" };
+    String [] vals = { "'test'", "42", "'somestring'" };
+    String [] extraArgs = {"--hive-create-only"};
+    runImportTest("CREATE_ONLY_HIVE_IMPORT", types, vals, "createOnlyImport.q", extraArgs);
+  }
+
+  /** Test that the table is created in Hive, replacing any existing table */
+  @Test
+  public void testCreateOverwriteHiveImport() throws IOException {
+    String [] types = { "VARCHAR(32)", "INTEGER", "CHAR(64)" };
+    String [] vals = { "'test'", "42", "'somestring'" };
+    String [] extraArgs = {"--hive-create-only", "--hive-overwrite"};
+    runImportTest("CREATE_OVERWRITE_HIVE_IMPORT", types, vals, "createOverwriteImport.q", extraArgs);
+  }
+
   /** Test that dates are coerced properly to strings */
   @Test
   public void testDate() throws IOException {

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestTableDefWriter.java
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestTableDefWriter.java?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestTableDefWriter.java (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/src/test/org/apache/hadoop/sqoop/hive/TestTableDefWriter.java Fri Feb 12 05:16:32 2010
@@ -73,7 +73,7 @@
     LOG.debug("Load data stmt: " + loadData);
 
     // Assert that the statements generated have the form we expect.
-    assertTrue(createTable.indexOf("CREATE TABLE outputTable") != -1);
+    assertTrue(createTable.indexOf("CREATE TABLE IF NOT EXISTS outputTable") != -1);
     assertTrue(loadData.indexOf("INTO TABLE outputTable") != -1);
     assertTrue(loadData.indexOf("/inputTable'") != -1);
   }

Added: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOnlyImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOnlyImport.q?rev=909237&view=auto
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOnlyImport.q (added)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOnlyImport.q Fri Feb 12 05:16:32 2010
@@ -0,0 +1 @@
+CREATE TABLE IF NOT EXISTS CREATE_ONLY_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;

Added: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOverwriteImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOverwriteImport.q?rev=909237&view=auto
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOverwriteImport.q (added)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/createOverwriteImport.q Fri Feb 12 05:16:32 2010
@@ -0,0 +1 @@
+CREATE TABLE CREATE_OVERWRITE_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/customDelimImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/customDelimImport.q?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/customDelimImport.q (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/customDelimImport.q Fri Feb 12 05:16:32 2010
@@ -1,2 +1,2 @@
-CREATE TABLE CUSTOM_DELIM_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\174' STORED AS TEXTFILE;
+CREATE TABLE IF NOT EXISTS CUSTOM_DELIM_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\174' STORED AS TEXTFILE;
 LOAD DATA INPATH 'file:BASEPATH/sqoop/warehouse/CUSTOM_DELIM_IMPORT' INTO TABLE CUSTOM_DELIM_IMPORT;

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/dateImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/dateImport.q?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/dateImport.q (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/dateImport.q Fri Feb 12 05:16:32 2010
@@ -1,2 +1,2 @@
-CREATE TABLE DATE_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
+CREATE TABLE IF NOT EXISTS DATE_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
 LOAD DATA INPATH 'file:BASEPATH/sqoop/warehouse/DATE_HIVE_IMPORT' INTO TABLE DATE_HIVE_IMPORT;

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/failingImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/failingImport.q?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/failingImport.q (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/failingImport.q Fri Feb 12 05:16:32 2010
@@ -1,2 +1,2 @@
-CREATE TABLE DATE_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
+CREATE TABLE IF NOT EXISTS DATE_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
 LOAD DATA INPATH 'file:BASEPATH/sqoop/warehouse/DATE_HIVE_IMPORT' INTO TABLE DATE_HIVE_IMPORT;

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/normalImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/normalImport.q?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/normalImport.q (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/normalImport.q Fri Feb 12 05:16:32 2010
@@ -1,2 +1,2 @@
-CREATE TABLE NORMAL_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
+CREATE TABLE IF NOT EXISTS NORMAL_HIVE_IMPORT ( DATA_COL0 STRING, DATA_COL1 INT, DATA_COL2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
 LOAD DATA INPATH 'file:BASEPATH/sqoop/warehouse/NORMAL_HIVE_IMPORT' INTO TABLE NORMAL_HIVE_IMPORT;

Modified: hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/numericImport.q
URL: http://svn.apache.org/viewvc/hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/numericImport.q?rev=909237&r1=909236&r2=909237&view=diff
==============================================================================
--- hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/numericImport.q (original)
+++ hadoop/mapreduce/trunk/src/contrib/sqoop/testdata/hive/scripts/numericImport.q Fri Feb 12 05:16:32 2010
@@ -1,2 +1,2 @@
-CREATE TABLE NUMERIC_HIVE_IMPORT ( DATA_COL0 DOUBLE, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
+CREATE TABLE IF NOT EXISTS NUMERIC_HIVE_IMPORT ( DATA_COL0 DOUBLE, DATA_COL1 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE;
 LOAD DATA INPATH 'file:BASEPATH/sqoop/warehouse/NUMERIC_HIVE_IMPORT' INTO TABLE NUMERIC_HIVE_IMPORT;