Posted to dev@pig.apache.org by "Tsz Wo (Nicholas), SZE (JIRA)" <ji...@apache.org> on 2010/02/10 21:04:32 UTC

[jira] Commented: (PIG-1234) Unable to create input slice for har:// files

    [ https://issues.apache.org/jira/browse/PIG-1234?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12832162#action_12832162 ] 

Tsz Wo (Nicholas), SZE commented on PIG-1234:
---------------------------------------------

More error messages:
{noformat}
Backend error message during job submission
-------------------------------------------
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Unable to create input slice for: har://hdfs-namenode/user/tsz/t20.har/t20
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:269)
	at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:810)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:781)
	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:730)
	at org.apache.hadoop.mapred.jobcontrol.Job.submit(Job.java:378)
	at org.apache.hadoop.mapred.jobcontrol.JobControl.startReadyJobs(JobControl.java:247)
	at org.apache.hadoop.mapred.jobcontrol.JobControl.run(JobControl.java:279)
	at java.lang.Thread.run(Thread.java:619)
Caused by: java.lang.IllegalArgumentException: Wrong FS: har://hdfs-namenode/user/tsz/t20.har/t20, expected: hdfs://namenode
	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
	at org.apache.hadoop.hdfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:99)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:155)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:453)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:648)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:203)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:131)
	at org.apache.pig.impl.io.FileLocalizer.fileExists(FileLocalizer.java:553)
	at org.apache.pig.backend.executionengine.PigSlicer.validate(PigSlicer.java:123)
	at org.apache.pig.impl.io.ValidatingInputFileSpec.validate(ValidatingInputFileSpec.java:59)
	at org.apache.pig.impl.io.ValidatingInputFileSpec.<init>(ValidatingInputFileSpec.java:44)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:240)
	... 7 more

Pig Stack Trace
---------------
ERROR 2118: Unable to create input slice for: har://hdfs-namenode/user/tsz/t20.har/t20

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias a
	at org.apache.pig.PigServer.openIterator(PigServer.java:482)
	at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:539)
	at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:241)
	at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:168)
	at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:144)
	at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
	at org.apache.pig.Main.main(Main.java:352)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 6015: During execution, encountered a Hadoop error.
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:269)
	at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:810)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:781)
	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:730)
	at org.apache.hadoop.mapred.jobcontrol.Job.submit(Job.java:378)
	at org.apache.hadoop.mapred.jobcontrol.JobControl.startReadyJobs(JobControl.java:247)
	at org.apache.hadoop.mapred.jobcontrol.JobControl.run(JobControl.java:279)
	at java.lang.Thread.run(Thread.java:619)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Unable to create input slice for: har://hdfs-namenode/user/tsz/t20.har/t20
	... 8 more
Caused by: java.lang.IllegalArgumentException: Wrong FS: har://hdfs-namenode/user/tsz/t20.har/t20, expected: hdfs://namenode
	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:310)
	at org.apache.hadoop.hdfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:99)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:155)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:453)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:648)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:203)
	at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:131)
	at org.apache.pig.impl.io.FileLocalizer.fileExists(FileLocalizer.java:553)
	at org.apache.pig.backend.executionengine.PigSlicer.validate(PigSlicer.java:123)
	at org.apache.pig.impl.io.ValidatingInputFileSpec.validate(ValidatingInputFileSpec.java:59)
	at org.apache.pig.impl.io.ValidatingInputFileSpec.<init>(ValidatingInputFileSpec.java:44)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:240)
================================================================================
{noformat}
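The root cause in the trace is the IllegalArgumentException from FileSystem.checkPath: the har:// input path is being validated against the job's default hdfs:// filesystem rather than against a filesystem derived from the path's own URI (e.g. via Path.getFileSystem(conf), which would resolve a HarFileSystem). A minimal, self-contained sketch of the scheme comparison that produces this error (illustrative only, not the actual Hadoop code; the class and method names here are made up):

```java
import java.net.URI;

// Hypothetical reduction of the checkPath-style scheme comparison:
// a path whose URI scheme differs from the filesystem's scheme is
// rejected as "Wrong FS". Checking har:// against hdfs:// reproduces
// the mismatch reported in PIG-1234.
public class WrongFsCheck {
    static boolean schemeMatches(URI fsUri, URI pathUri) {
        // A null path scheme means "use the default filesystem", which is accepted.
        String pathScheme = pathUri.getScheme();
        return pathScheme == null || pathScheme.equalsIgnoreCase(fsUri.getScheme());
    }

    public static void main(String[] args) {
        URI fs = URI.create("hdfs://namenode");
        URI input = URI.create("har://hdfs-namenode/user/tsz/t20.har/t20");
        if (!schemeMatches(fs, input)) {
            // prints: Wrong FS: har://hdfs-namenode/user/tsz/t20.har/t20, expected: hdfs://namenode
            System.out.println("Wrong FS: " + input + ", expected: " + fs);
        }
    }
}
```

Under that reading, the fix would presumably be for Pig's slice-creation path (FileLocalizer/HDataStorage) to obtain the filesystem from the input URI itself instead of reusing the session's default DistributedFileSystem.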

> Unable to create input slice for har:// files
> ---------------------------------------------
>
>                 Key: PIG-1234
>                 URL: https://issues.apache.org/jira/browse/PIG-1234
>             Project: Pig
>          Issue Type: Bug
>            Reporter: Tsz Wo (Nicholas), SZE
>
> Tried to load har:// files
> {noformat}
> grunt> a = LOAD 'har://hdfs-namenode/user/tsz/t20.har/t20' USING PigStorage('\n') AS (line);
> grunt> dump 
> {noformat}
> but Pig says
> {noformat}
> 2010-02-10 18:42:20,750 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2118:
>  Unable to create input slice for: har://hdfs-namenode/user/tsz/t20.har/t20
> {noformat}

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.