Posted to dev@sqoop.apache.org by "William Watson (JIRA)" <ji...@apache.org> on 2014/06/03 16:15:02 UTC

[jira] [Updated] (SQOOP-1333) Sqoop Fails to Import from PostgreSQL to S3 with Confusing "Imported Failed: null" exception

     [ https://issues.apache.org/jira/browse/SQOOP-1333?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

William Watson updated SQOOP-1333:
----------------------------------

    Description: 
I see that some issues with importing from MySQL to S3 have been resolved (SQOOP-891), but I can't find any information on the following command and error:

{code}
sqoop import -Dfs.defaultFS=s3n:// --connect "jdbc:postgresql://ip_address/cleanroom_matching?extra_options=-Dfs.defaultFS%3Ds3n%3A%2F%2F" --fields-terminated-by \\t --username [omitted] --password [omitted] --split-by cr_user_id  --query "SELECT * FROM table WHERE (\$CONDITIONS)" --direct --delete-target-dir --target-dir 's3n://[omitted]:[omitted]@[omitted]/sqoop/'  --verbose
{code}

{code}
14/06/03 09:49:53 ERROR tool.ImportTool: Imported Failed: null
{code}

That's it; there's no stack trace. The query works on its own, and if I import to disk it works just fine. It's only when I switch to S3 that it fails. It was originally failing because I hadn't set the default file system; after I set it, I started getting this confusing error.
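The "null" in the message is presumably just the underlying exception's message text, which is null for a bare NullPointerException, so nothing useful gets printed. Below is a minimal sketch of an invocation shape that should sidestep the problem, assuming the s3n filesystem on Hadoop 2 and passing the AWS credentials through the standard fs.s3n.awsAccessKeyId / fs.s3n.awsSecretAccessKey properties instead of embedding them in the target URL; every [bracketed] value is a placeholder, not a value taken from this report:

{code}
# Sketch only, all [bracketed] values are placeholders: point the default
# filesystem at the bucket itself and supply the S3 credentials as Hadoop
# properties, keeping keys out of the URL (secret keys containing "+" or
# "/" are known to break the user:pass@bucket URL form).
sqoop import \
  -Dfs.defaultFS=s3n://[bucket]/ \
  -Dfs.s3n.awsAccessKeyId=[access-key-id] \
  -Dfs.s3n.awsSecretAccessKey=[secret-access-key] \
  --connect "jdbc:postgresql://[host]/[database]" \
  --username [user] --password [password] \
  --query "SELECT * FROM [table] WHERE \$CONDITIONS" \
  --split-by [column] \
  --direct \
  --delete-target-dir \
  --target-dir s3n://[bucket]/sqoop/ \
  --verbose
{code}

Pointing fs.defaultFS at the bucket rather than at the bare scheme (s3n:// with no authority, as in the command above) may also matter, since a scheme-only default filesystem gives Hadoop nothing to resolve paths against.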

  was:
I see that some issues with importing from MySQL to S3 have been resolved (SQOOP-891), but I can't find any information on the following command and error:

{code}
sqoop import -Dfs.defaultFS=s3n:// --connect "jdbc:postgresql://33.33.33.12/cleanroom_matching?extra_options=-Dfs.defaultFS%3Ds3n%3A%2F%2F" --fields-terminated-by \\t --username cleanroom_user --password waterfall --split-by cr_user_id --query "SELECT cr_user_id, ad_day, ad_hour, agency_organization_id, advertiser_organization_id, pixel_id, av1, av2, av3, av4, av5, av6, av7, av8, av9, number_of_actions, lvl FROM (SELECT
    cr_user_id,
    ad_day,
    ad_hour,
    agency_organization_id,
    advertiser_organization_id,
    pixel_id,
    av1,
    av2,
    av3,
    av4,
    av5,
    av6,
    av7,
    av8,
    av9,
    number_of_actions,
    lvl
  FROM match_id_to_cr_user_id_mapping
  JOIN match_manifest USING (match_id)
  JOIN cleanroom_actions USING (match_id)) AS transform_columns WHERE (\$CONDITIONS)" --direct --delete-target-dir --target-dir 's3n://AKIAIACJB644REMVVE7Q:ehmI6SGPzARCGf0+m+ModcKQxnK80u5ujCs3NQDf@korrelate-test/sqoop/ko2o.action-event-matched-users-test' --map-column-java cr_user_id=String,ad_day=String,ad_hour=String,agency_organization_id=String,advertiser_organization_id=String,pixel_id=String,av1=String,av2=String,av3=String,av4=String,av5=String,av6=String,av7=String,av8=String,av9=String,number_of_actions=String,lvl=String --null-string '' --null-non-string '' --verbose
{code}

{code}
14/06/03 09:49:53 ERROR tool.ImportTool: Imported Failed: null
{code}

That's it; there's no stack trace. The query works on its own, and if I import to disk it works just fine. It's only when I switch to S3 that it fails. It was originally failing because I hadn't set the default file system; after I set it, I started getting this confusing error.


> Sqoop Fails to Import from PostgreSQL to S3 with Confusing "Imported Failed: null" exception
> --------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-1333
>                 URL: https://issues.apache.org/jira/browse/SQOOP-1333
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.4.4
>         Environment: CentOS 6, Hadoop 2, Sqoop 1.4.4
>            Reporter: William Watson
>
> I see that some issues with importing from MySQL to S3 have been resolved (SQOOP-891), but I can't find any information on the following command and error:
> {code}
> sqoop import -Dfs.defaultFS=s3n:// --connect "jdbc:postgresql://ip_address/cleanroom_matching?extra_options=-Dfs.defaultFS%3Ds3n%3A%2F%2F" --fields-terminated-by \\t --username [omitted] --password [omitted] --split-by cr_user_id  --query "SELECT * FROM table WHERE (\$CONDITIONS)" --direct --delete-target-dir --target-dir 's3n://[omitted]:[omitted]@[omitted]/sqoop/'  --verbose
> {code}
> {code}
> 14/06/03 09:49:53 ERROR tool.ImportTool: Imported Failed: null
> {code}
> That's it; there's no stack trace. The query works on its own, and if I import to disk it works just fine. It's only when I switch to S3 that it fails. It was originally failing because I hadn't set the default file system; after I set it, I started getting this confusing error.



--
This message was sent by Atlassian JIRA
(v6.2#6252)