Posted to user@sqoop.apache.org by Kleiton Silva <kl...@gmail.com> on 2014/03/12 17:05:53 UTC

Help Sqoop Import

Hello my friends,

I have a question about Sqoop and I hope you can help me.

I am trying to import a table with two columns from MySQL. When I try to
execute the import with the following command:

start job --jid 2

I get this error:

2014-03-13 12:54:31 PDT: FAILURE_ON_SUBMIT
Exception: java.io.FileNotFoundException: File does not exist:
hdfs://oak:54310/usr/local/Cellar/hadoop/2.2.0/libexec/share/hadoop/common/lib/guava-11.0.2.jar

Commands I had to run before this error:

hdfs dfs -mkdir /usr/lib/sqoop/lib
hdfs dfs -copyFromLocal /usr/lib/sqoop/lib/*.jar /usr/lib/sqoop/lib

hdfs dfs -mkdir -p /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib
hdfs dfs -copyFromLocal /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib/*.jar /usr/lib/sqoop/server/webapps/sqoop/WEB-INF/lib


Is it really necessary to copy all the jars to HDFS, or is there a smarter
solution?



Thank you.

Kleiton Silva

Re: Help Sqoop Import

Posted by john zhao <jz...@alpinenow.com>.
First, make sure you can submit an MR job from your command line.
Then check that your Sqoop installation has the correct HADOOP_HOME
environment variable, so it can pick up all the Hadoop configuration.
Are you using YARN or MR1? This typically happens when you start the cluster
in YARN mode but HADOOP_HOME points to an MR1 folder.
At least that was the case for me last time.
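
A quick way to check both points from the shell (the jar path is an example
for a Hadoop 2.2.0 install; adjust it to your layout):

```shell
# Confirm HADOOP_HOME points at a Hadoop 2 (YARN) install, not an MR1 one.
echo "$HADOOP_HOME"
ls "$HADOOP_HOME/etc/hadoop/yarn-site.xml"   # present only in a Hadoop 2 layout

# Confirm a plain MapReduce job submits from the command line.
hadoop jar "$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar" pi 2 10
```

If the example job fails the same way, the problem is in the Hadoop setup
rather than in Sqoop.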

John.
On 03/12/2014 10:28 AM, Kleiton Silva wrote:
> Hey John, any idea what exactly I should verify in these configurations?
> Yeah, I'm running locally on my computer.
>
> Thank you.


Re: Help Sqoop Import

Posted by Kleiton Silva <kl...@gmail.com>.
Hey John, any idea what exactly I should verify in these configurations?
Yeah, I'm running locally on my computer.

Thank you.




On Wed, Mar 12, 2014 at 9:38 AM, John Zhao <jz...@alpinenow.com> wrote:

> No, you do not need to manually copy the jar files.
> Usually this happens when you run in MR local mode with YARN. Check your
> Hadoop settings or Sqoop settings to make sure you get the correct job
> tracker.
>
> John.

Re: Help Sqoop Import

Posted by John Zhao <jz...@alpinenow.com>.
No, you do not need to manually copy the jar files.
Usually this happens when you run in MR local mode with YARN. Check your Hadoop settings or Sqoop settings to make sure you get the correct job tracker.
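
One concrete setting to look at (the file location assumes a standard Hadoop
2.x layout): mapred-site.xml decides whether jobs go to the cluster or run
in-process.

```shell
# For a YARN cluster this should print "yarn"; "local" (or no entry)
# means jobs run in the local runner and jar paths resolve oddly.
grep -A 1 'mapreduce.framework.name' "$HADOOP_HOME/etc/hadoop/mapred-site.xml"
```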

John.
