Posted to commits@beam.apache.org by "Aviem Zur (JIRA)" <ji...@apache.org> on 2017/05/12 11:28:04 UTC

[jira] [Updated] (BEAM-2277) IllegalArgumentException when using Hadoop file system for WordCount example.

     [ https://issues.apache.org/jira/browse/BEAM-2277?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aviem Zur updated BEAM-2277:
----------------------------
    Description: 
IllegalArgumentException when using Hadoop file system for WordCount example.

Occurred when running the WordCount example using the Spark runner on a YARN cluster.

Command-line arguments: {code}
--inputFile=hdfs:///user/myuser/kinglear.txt --output=hdfs:///user/myuser/wc/
{code}
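A possible mitigation (an assumption on my part, not something stated in this report) is to keep the sink's temporary files on the same file system as the final output, so that source and destination resources share the `hdfs` scheme. Beam pipelines accept a `--tempLocation` option for this; the paths below are hypothetical:

```shell
# Hypothetical invocation: pinning tempLocation to HDFS so temp and final
# files use the same scheme. The proper fix is expected in the release
# this issue targets; this is only a sketch of a workaround.
--inputFile=hdfs:///user/myuser/kinglear.txt \
--output=hdfs:///user/myuser/wc/ \
--tempLocation=hdfs:///user/myuser/tmp/
```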

Stack trace:
{code}
java.lang.IllegalArgumentException: Expect srcResourceIds and destResourceIds have the same scheme, but received file, hdfs.
	at org.apache.beam.sdk.repackaged.com.google.common.base.Preconditions.checkArgument(Preconditions.java:122)
	at org.apache.beam.sdk.io.FileSystems.validateSrcDestLists(FileSystems.java:394)
	at org.apache.beam.sdk.io.FileSystems.copy(FileSystems.java:236)
	at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.copyToOutputFiles(FileBasedSink.java:626)
	at org.apache.beam.sdk.io.FileBasedSink$WriteOperation.finalize(FileBasedSink.java:516)
	at org.apache.beam.sdk.io.WriteFiles$2.processElement(WriteFiles.java:592)
{code}
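The exception comes from a precondition that every source/destination pair passed to {{FileSystems.copy}} share the same URI scheme; here the sink's temporary files were written to the local file system (scheme `file`) while the final output is on `hdfs`. The sketch below is a hypothetical re-creation of that kind of validation for illustration, not the actual Beam implementation (the real logic is in `FileSystems.validateSrcDestLists`):

```java
import java.net.URI;
import java.util.List;

public class SchemeCheck {
    // Hypothetical stand-in for the validation that throws in the stack
    // trace above: every src/dest pair must share a URI scheme.
    static void validateSameScheme(List<URI> srcs, List<URI> dests) {
        for (int i = 0; i < srcs.size(); i++) {
            // A URI with no scheme (a plain local path) is treated as "file".
            String srcScheme =
                srcs.get(i).getScheme() == null ? "file" : srcs.get(i).getScheme();
            String destScheme =
                dests.get(i).getScheme() == null ? "file" : dests.get(i).getScheme();
            if (!srcScheme.equals(destScheme)) {
                throw new IllegalArgumentException(
                    "Expect srcResourceIds and destResourceIds have the same scheme, "
                        + "but received " + srcScheme + ", " + destScheme + ".");
            }
        }
    }

    public static void main(String[] args) {
        // Temp file written locally, final output on HDFS -> scheme mismatch,
        // matching the failure mode in this report.
        List<URI> srcs = List.of(URI.create("/tmp/beam-temp/part-00000"));
        List<URI> dests = List.of(URI.create("hdfs:///user/myuser/wc/part-00000"));
        try {
            validateSameScheme(srcs, dests);
            System.out.println("OK");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Running it prints the same "same scheme, but received file, hdfs" message seen in the stack trace.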

> IllegalArgumentException when using Hadoop file system for WordCount example.
> -----------------------------------------------------------------------------
>
>                 Key: BEAM-2277
>                 URL: https://issues.apache.org/jira/browse/BEAM-2277
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-java-extensions
>            Reporter: Aviem Zur
>            Assignee: Davor Bonaci
>            Priority: Blocker
>             Fix For: 2.0.0
>



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)