Posted to dev@sqoop.apache.org by "Eric Lin (JIRA)" <ji...@apache.org> on 2017/05/30 09:56:04 UTC

[jira] [Commented] (SQOOP-3188) Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject)

    [ https://issues.apache.org/jira/browse/SQOOP-3188?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16029229#comment-16029229 ] 

Eric Lin commented on SQOOP-3188:
---------------------------------

Hi [~markuskemper@me.com],

I tried to reproduce this against trunk code, and it does NOT appear to be an issue for me. I created the /usr/lib/hive directory with 777 permissions and nothing under it, and my sqoop import worked as normal. Maybe it is environment specific?
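
For reference, this is roughly the check I ran (a minimal sketch of STEP 03 from the test case quoted below; the connect string, table, and paths come from that test case and will differ per environment):

{noformat}
# Create the empty directory exactly as in STEP 03 below
mkdir /usr/lib/hive
chmod 777 /usr/lib/hive

# Re-run the same import; against trunk this completed normally for me
sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD \
    --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1

# Clean up the test directory afterwards
rm -rf /usr/lib/hive
{noformat}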

> Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject)
> ------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-3188
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3188
>             Project: Sqoop
>          Issue Type: Bug
>            Reporter: Markus Kemper
>
> Sqoop1 (import + --target-dir) with empty directory (/usr/lib/hive) fails with error (java.lang.NoClassDefFoundError: org/json/JSONObject), see test case below.
> *Test Case*
> {noformat}
> #################
> # STEP 01 - Create Table and Data
> #################
> export MYCONN=jdbc:mysql://mysql.sqoop.com:3306/sqoop
> export MYUSER=sqoop
> export MYPSWD=sqoop
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "drop table t1"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "create table t1 (c1 int, c2 date, c3 varchar(10))"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "insert into t1 values (1, current_date, 'some data')"
> sqoop eval --connect $MYCONN --username $MYUSER --password $MYPSWD --query "select * from t1"
> Output:
> -----------------------------------------
> | c1          | c2         | c3         | 
> -----------------------------------------
> | 1           | 2017-05-10 | some data  | 
> -----------------------------------------
> #################
> # STEP 02 - Import Data into HDFS 
> #################
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> hdfs dfs -cat /user/root/t1/part*
> Output:
> 17/05/10 13:46:24 INFO mapreduce.ImportJobBase: Transferred 23 bytes in 22.65 seconds (1.0155 bytes/sec)
> 17/05/10 13:46:24 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1,2017-05-10,some data
> #################
> # STEP 03 - Create Bogus Hive Directory and Attempt to Import into HDFS
> #################
> mkdir /usr/lib/hive
> chmod 777 /usr/lib/hive
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> Output:
> 17/05/10 13:47:44 INFO mapreduce.ImportJobBase: Beginning import of t1
> Exception in thread "main" java.lang.NoClassDefFoundError: org/json/JSONObject
> 	at org.apache.sqoop.util.SqoopJsonUtil.getJsonStringforMap(SqoopJsonUtil.java:43)
> 	at org.apache.sqoop.SqoopOptions.writeProperties(SqoopOptions.java:776)
> 	at org.apache.sqoop.mapreduce.JobBase.putSqoopOptionsToConfiguration(JobBase.java:388)
> 	at org.apache.sqoop.mapreduce.JobBase.createJob(JobBase.java:374)
> 	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:256)
> 	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
> 	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
> 	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
> 	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
> Caused by: java.lang.ClassNotFoundException: org.json.JSONObject
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	... 15 more
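> # Note (hedged analysis, not part of the original test output): the trace above
> # shows the client JVM failing to load org.json.JSONObject while building the
> # job configuration, meaning no jar providing that class was on Sqoop's runtime
> # classpath once /usr/lib/hive existed. A minimal sketch for finding which local
> # jars bundle the class (the lib paths below are assumptions; adjust them to the
> # actual install layout):
> for j in /usr/lib/sqoop/lib/*.jar /usr/lib/hive/lib/*.jar; do
>   unzip -l "$j" 2>/dev/null | grep -q 'org/json/JSONObject.class' && echo "$j"
> done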
> #################
> # STEP 04 - Remove Bogus Hive Directory and Attempt to Import into HDFS 
> #################
> rm -rf /usr/lib/hive
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table t1 --target-dir /user/root/t1 --delete-target-dir --num-mappers 1
> hdfs dfs -cat /user/root/t1/part*
> Output:
> 17/05/10 13:52:30 INFO mapreduce.ImportJobBase: Transferred 23 bytes in 22.6361 seconds (1.0161 bytes/sec)
> 17/05/10 13:52:30 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ~~~~~
> 1,2017-05-10,some data
> {noformat}


