Posted to user@sqoop.apache.org by John Zhao <jz...@alpinenow.com> on 2014/02/18 03:07:06 UTC

Error when exporting to a PostgreSQL table with a capitalized column name in Sqoop 1.4.4 with CDH5B2

When I run the following sqoop command
./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo  --table golf_cap_col --export-dir /csv/golf_cap_col --input-fields-terminated-by ,  --username miner_demo --password miner_demo -m 1 -- --schema demo  


The input table is like this:
CREATE TABLE demo.golf_cap_col(outlook text, temperature integer, humidity integer, wind text, "PLAY" text);
INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind, "PLAY") VALUES ('sunny', 80, 81, 'false', 'no');
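
For reference, PostgreSQL folds unquoted identifiers to lower case, so any SQL that touches this column has to double-quote "PLAY" exactly, otherwise the database looks for a column named play. A minimal illustration in psql, using the table above (this is only an illustration, not the exact statement Sqoop generates):

    -- unquoted identifier is folded to "play", which does not exist;
    -- fails with an error like: column "play" of relation "golf_cap_col" does not exist
    INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind, PLAY)
    VALUES ('sunny', 80, 81, 'false', 'no');

    -- quoted identifier keeps its exact case and succeeds
    INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind, "PLAY")
    VALUES ('sunny', 80, 81, 'false', 'no');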


Does anyone have any idea about this? Do you know whether this is an open issue?

John Zhao



Re: Error when exporting to a PostgreSQL table with a capitalized column name in Sqoop 1.4.4 with CDH5B2

Posted by John Zhao <jz...@alpinenow.com>.
I can only find the following log in the YARN web UI:
http://localhost:8088/cluster/app/application_1392781107478_0001
But I don't think it helps to solve it. Can you just try it on your local setup?

2014-02-18 19:40:32,576 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
TaskAttempt: [attempt_1392781107478_0001_m_000000_0] using
containerId: [container_1392781107478_0001_01_000002 on NM:
[10.0.0.25:49518]
2014-02-18 19:40:32,579 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
attempt_1392781107478_0001_m_000000_0 TaskAttempt Transitioned from
ASSIGNED to RUNNING
2014-02-18 19:40:32,579 INFO [AsyncDispatcher event handler]
org.apache.hadoop.mapreduce.v2.app.job.impl.TaskImpl:
task_1392781107478_0001_m_000000 Task Transitioned from SCHEDULED to
RUNNING
2014-02-18 19:40:33,341 INFO [RMCommunicator Allocator]
org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
getResources() for application_1392781107478_0001: ask=3 release= 0
newContainers=0 finishedContainers=0 resourcelimit=<memory:0,
vCores:0> knownNMs=1
2014-02-18 19:40:34,193 INFO [Socket Reader #1 for port 49604]
SecurityLogger.org.apache.hadoop.ipc.Server: Auth successful for
job_1392781107478_0001 (auth:SIMPLE)
2014-02-18 19:40:34,209 INFO [IPC Server handler 0 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID :
jvm_1392781107478_0001_m_000002 asked for a task
2014-02-18 19:40:34,209 INFO [IPC Server handler 0 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: JVM with ID:
jvm_1392781107478_0001_m_000002 given task:
attempt_1392781107478_0001_m_000000_0
2014-02-18 19:40:35,094 INFO [IPC Server handler 1 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Status update from
attempt_1392781107478_0001_m_000000_0
2014-02-18 19:40:35,094 INFO [IPC Server handler 1 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1392781107478_0001_m_000000_0 is : 0.0
2014-02-18 19:40:37,792 INFO [IPC Server handler 2 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Ping from
attempt_1392781107478_0001_m_000000_0
2014-02-18 19:40:40,806 INFO [IPC Server handler 3 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Status update from
attempt_1392781107478_0001_m_000000_0
2014-02-18 19:40:40,806 INFO [IPC Server handler 3 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Progress of
TaskAttempt attempt_1392781107478_0001_m_000000_0 is : 1.0
2014-02-18 19:40:43,813 INFO [IPC Server handler 4 on 49604]
org.apache.hadoop.mapred.TaskAttemptListenerImpl: Ping from
attempt_1392781107478_0001_m_000000_0
[The same three-line "Ping from attempt_1392781107478_0001_m_000000_0" entry then repeats roughly every 3 seconds, from 19:40:46 through 19:43:13, with no further progress updates and no errors.]
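
The application master log above only records heartbeat pings; any SQL exception from the JDBC driver would show up in the map task's own container log instead. Assuming YARN log aggregation is enabled, the container logs can usually be pulled with something like:

    yarn logs -applicationId application_1392781107478_0001

using the application id from the log above.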



On Tue, Feb 18, 2014 at 2:35 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:

> Ah that's really weird. Did you check the logs in Yarn and MR?
>
>
> On Tue, Feb 18, 2014 at 2:01 PM, john zhao <jz...@alpinenow.com> wrote:
>
>> So here are the steps:
>> 1) I create a table in PostgreSQL with a capitalized column name:
>>
>>         CREATE TABLE demo.golf_cap_col(outlook text, temperature
>> integer, humidity integer, wind text, "PLAY" text);
>>
>> This SQL was executed from pgAdmin III; if you run it from the command
>> line, you might need to add extra quotes and escape characters, I guess,
>> but I only ran it from pgAdmin.
>> Of course you need to have a schema named demo.
>>
>> 2) Then I run the following Sqoop command; the progress shows the map
>> phase 100% finished, and then the console hangs.
>>
>>
>>  ./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo
>> --table golf_cap_col --export-dir /csv/golf_cap_col
>> --input-fields-terminated-by ,  --username miner_demo --password miner_demo
>> -m 1 -- --schema demo
>>
>> The file is only one line, like this:   sunny, 80, 81, false, no
>>
>> There is no error log, and the YARN console always shows that the
>> process is running and never stops. Finally I have to kill it manually.
>>
>> Thanks.
>> John.
>>
>>
>> On 02/18/2014 01:42 PM, Abraham Elmahrek wrote:
>>
>> John,
>>
>>  Could you be more explicit about the problem you are referring to? I
>> don't think Sqoop will automatically create the table and the insert
>> command you've provided does work for me.
>>
>>  -Abe
>>
>>
>> On Mon, Feb 17, 2014 at 6:07 PM, John Zhao <jz...@alpinenow.com> wrote:
>>
>>> When I run the following sqoop command
>>> ./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo
>>> --table golf_cap_col --export-dir /csv/golf_cap_col
>>> --input-fields-terminated-by ,  --username miner_demo --password miner_demo
>>> -m 1 -- --schema demo
>>>
>>>
>>>  The input table is like this:
>>>  CREATE TABLE demo.golf_cap_col(outlook text, temperature
>>> integer, humidity integer, wind text, "PLAY" text);
>>>  INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind,
>>> "PLAY") VALUES ('sunny', 80, 81, 'false', 'no');
>>>
>>>
>>>  Does anyone have any idea about this? Do you know whether this is
>>> an open issue?
>>>
>>>  John Zhao
>>>
>>>
>>>
>>
>>
>

Re: Error when exporting to a PostgreSQL table with a capitalized column name in Sqoop 1.4.4 with CDH5B2

Posted by Abraham Elmahrek <ab...@cloudera.com>.
Ah that's really weird. Did you check the logs in Yarn and MR?


On Tue, Feb 18, 2014 at 2:01 PM, john zhao <jz...@alpinenow.com> wrote:

>  So here are the steps:
> 1) I create a table in PostgreSQL with a capitalized column name:
>
>         CREATE TABLE demo.golf_cap_col(outlook text, temperature
> integer, humidity integer, wind text, "PLAY" text);
>
> This SQL was executed from pgAdmin III; if you run it from the command line,
> you might need to add extra quotes and escape characters, I guess, but I only
> ran it from pgAdmin.
> Of course you need to have a schema named demo.
>
> 2) Then I run the following Sqoop command; the progress shows the map phase
> 100% finished, and then the console hangs.
>
>
>  ./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo
> --table golf_cap_col --export-dir /csv/golf_cap_col
> --input-fields-terminated-by ,  --username miner_demo --password miner_demo
> -m 1 -- --schema demo
>
> The file is only one line, like this:   sunny, 80, 81, false, no
>
> There is no error log, and the YARN console always shows that the
> process is running and never stops. Finally I have to kill it manually.
>
> Thanks.
> John.
>
>
> On 02/18/2014 01:42 PM, Abraham Elmahrek wrote:
>
> John,
>
>  Could you be more explicit about the problem you are referring to? I
> don't think Sqoop will automatically create the table and the insert
> command you've provided does work for me.
>
>  -Abe
>
>
> On Mon, Feb 17, 2014 at 6:07 PM, John Zhao <jz...@alpinenow.com> wrote:
>
>> When I run the following sqoop command
>> ./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo
>> --table golf_cap_col --export-dir /csv/golf_cap_col
>> --input-fields-terminated-by ,  --username miner_demo --password miner_demo
>> -m 1 -- --schema demo
>>
>>
>>  The input table is like this:
>>  CREATE TABLE demo.golf_cap_col(outlook text, temperature
>> integer, humidity integer, wind text, "PLAY" text);
>>  INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind,
>> "PLAY") VALUES ('sunny', 80, 81, 'false', 'no');
>>
>>
>>  Does anyone have any idea about this? Do you know whether this is an
>> open issue?
>>
>>  John Zhao
>>
>>
>>
>
>

Re: Error when exporting to a PostgreSQL table with a capitalized column name in Sqoop 1.4.4 with CDH5B2

Posted by john zhao <jz...@alpinenow.com>.
So here are the steps:
1) I create a table in PostgreSQL with a capitalized column name:
         CREATE TABLE demo.golf_cap_col(outlook text, temperature
integer, humidity integer, wind text, "PLAY" text);

This SQL was executed from pgAdmin III; if you run it from the command line,
you might need to add extra quotes and escape characters, I guess, but I only
ran it from pgAdmin.
Of course you need to have a schema named demo.

2) Then I run the following Sqoop command; the progress shows the map phase
100% finished, and then the console hangs.

./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo  
--table golf_cap_col --export-dir /csv/golf_cap_col 
--input-fields-terminated-by ,  --username miner_demo --password 
miner_demo -m 1 -- --schema demo

The file is only one line, like this:   sunny, 80, 81, false, no

There is no error log, and the YARN console always shows that the
process is running and never stops. Finally I have to kill it manually.

Thanks.
John.

On 02/18/2014 01:42 PM, Abraham Elmahrek wrote:
> John,
>
> Could you be more explicit about the problem you are referring to? I 
> don't think Sqoop will automatically create the table and the insert 
> command you've provided does work for me.
>
> -Abe
>
>
> On Mon, Feb 17, 2014 at 6:07 PM, John Zhao <jzhao@alpinenow.com> wrote:
>
>     When I run the following sqoop command
>     ./sqoop export --connect
>     jdbc:postgresql://localhost:5432/miner_demo --table golf_cap_col
>     --export-dir /csv/golf_cap_col --input-fields-terminated-by , 
>     --username miner_demo --password miner_demo -m 1 -- --schema demo
>
>
>     The input table is like this:
>     CREATE TABLE demo.golf_cap_col(outlook text, temperature
>     integer, humidity integer, wind text, "PLAY" text);
>     INSERT INTO demo.golf_cap_col(outlook, temperature, humidity,
>     wind, "PLAY") VALUES ('sunny', 80, 81, 'false', 'no');
>
>
>     Does anyone have any idea about this? Do you know whether this
>     is an open issue?
>
>     John Zhao
>
>
>


Re: Error when exporting to a PostgreSQL table with a capitalized column name in Sqoop 1.4.4 with CDH5B2

Posted by Abraham Elmahrek <ab...@cloudera.com>.
John,

Could you be more explicit about the problem you are referring to? I don't
think Sqoop will automatically create the table and the insert command
you've provided does work for me.

-Abe


On Mon, Feb 17, 2014 at 6:07 PM, John Zhao <jz...@alpinenow.com> wrote:

> When I run the following sqoop command
> ./sqoop export --connect jdbc:postgresql://localhost:5432/miner_demo
> --table golf_cap_col --export-dir /csv/golf_cap_col
> --input-fields-terminated-by ,  --username miner_demo --password miner_demo
> -m 1 -- --schema demo
>
>
> The input table is like this:
>  CREATE TABLE demo.golf_cap_col(outlook text, temperature
> integer, humidity integer, wind text, "PLAY" text);
> INSERT INTO demo.golf_cap_col(outlook, temperature, humidity, wind,
> "PLAY") VALUES ('sunny', 80, 81, 'false', 'no');
>
>
> Does anyone have any idea about this? Do you know whether this is an
> open issue?
>
> John Zhao
>
>
>