Posted to dev@pig.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2012/10/23 22:37:13 UTC

[jira] [Commented] (PIG-2979) ant test-e2e-local fails due to missing run-time dependencies in classpath with hadoop-2.0.x

    [ https://issues.apache.org/jira/browse/PIG-2979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13482657#comment-13482657 ] 

Cheolsoo Park commented on PIG-2979:
------------------------------------

There are 2 issues:

1) slf4j has to be bundled in pig.jar. Easy to fix.

2) java.io.IOException: No FileSystem for scheme: file

This is a regression from HADOOP-7549.

HADOOP-7549 changed FileSystem implementations to be discovered via ServiceLoader instead of being configured by configuration files. As part of these changes, the "fs.file.impl" property was removed from core-default.xml.

Now, to map the scheme "file://" to the org.apache.hadoop.fs.LocalFileSystem class, the fully qualified name of LocalFileSystem must be listed in the provider-configuration file (META-INF/services/org.apache.hadoop.fs.FileSystem) in pig.jar. However, it isn't.
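For reference, the ServiceLoader lookup that HADOOP-7549 introduced can be sketched with a stand-alone example. The interface and class names below are hypothetical stand-ins for org.apache.hadoop.fs.FileSystem and its implementations, and the provider-configuration file is written to a temp directory instead of living inside a jar; the mechanism is the same either way:

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Collections;
import java.util.ServiceLoader;

public class ServiceLoaderDemo {
    // Stand-in for org.apache.hadoop.fs.FileSystem
    public interface SchemeFs { String scheme(); }

    // Stand-in for org.apache.hadoop.fs.LocalFileSystem
    public static class LocalFs implements SchemeFs {
        public String scheme() { return "file"; }
    }

    // Returns the provider class registered for "file", or null if none is
    // listed in the provider-configuration file -- the analogue of
    // "java.io.IOException: No FileSystem for scheme: file".
    public static String resolve(boolean listProvider) throws IOException {
        // Emulate META-INF/services/<interface-binary-name> inside a jar.
        Path root = Files.createTempDirectory("sl-demo");
        Path services = root.resolve("META-INF/services");
        Files.createDirectories(services);
        Files.write(services.resolve(SchemeFs.class.getName()), listProvider
                ? Collections.singletonList(LocalFs.class.getName())
                : Collections.<String>emptyList());

        try (URLClassLoader cl = new URLClassLoader(
                new URL[] { root.toUri().toURL() },
                ServiceLoaderDemo.class.getClassLoader())) {
            for (SchemeFs fs : ServiceLoader.load(SchemeFs.class, cl)) {
                if ("file".equals(fs.scheme())) {
                    return fs.getClass().getName();
                }
            }
            return null; // scheme not registered in the provider file
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("with entry:    " + resolve(true));
        System.out.println("without entry: " + resolve(false));
    }
}
```

With the provider listed, the scheme resolves; with an empty provider file, lookup fails exactly the way it does for pig.jar.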

Here is the diff of META-INF/services/org.apache.hadoop.fs.FileSystem between hadoop-common.jar and pig.jar:
{code:title=hadoop-common.jar}
org.apache.hadoop.fs.LocalFileSystem
org.apache.hadoop.fs.viewfs.ViewFileSystem
org.apache.hadoop.fs.s3.S3FileSystem
org.apache.hadoop.fs.s3native.NativeS3FileSystem
org.apache.hadoop.fs.kfs.KosmosFileSystem
org.apache.hadoop.fs.ftp.FTPFileSystem
org.apache.hadoop.fs.HarFileSystem
{code}
{code:title=pig.jar}
org.apache.hadoop.hdfs.DistributedFileSystem
org.apache.hadoop.hdfs.HftpFileSystem
org.apache.hadoop.hdfs.HsftpFileSystem
org.apache.hadoop.hdfs.web.WebHdfsFileSystem
{code}
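The lists above also show why naive fat-jar builds lose providers: both hadoop-common.jar and hadoop-hdfs jars ship a file at the same path, so whichever jar is unpacked last wins unless the entries are concatenated. A minimal sketch of that merge step (not Pig's actual build logic; the class name is hypothetical):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;

public class MergeServiceFiles {
    // Concatenate two provider lists, dropping duplicates while
    // preserving order -- instead of letting one file overwrite the other.
    public static List<String> merge(List<String> a, List<String> b) {
        LinkedHashSet<String> merged = new LinkedHashSet<>(a);
        merged.addAll(b);
        return new ArrayList<>(merged);
    }

    public static void main(String[] args) {
        List<String> common = Arrays.asList(
                "org.apache.hadoop.fs.LocalFileSystem",
                "org.apache.hadoop.fs.HarFileSystem");
        List<String> hdfs = Arrays.asList(
                "org.apache.hadoop.hdfs.DistributedFileSystem");
        // The merged list is what the fat jar's provider file should contain.
        merge(common, hdfs).forEach(System.out::println);
    }
}
```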
                
> ant test-e2e-local fails due to missing run-time dependencies in classpath with hadoop-2.0.x
> --------------------------------------------------------------------------------------------
>
>                 Key: PIG-2979
>                 URL: https://issues.apache.org/jira/browse/PIG-2979
>             Project: Pig
>          Issue Type: Sub-task
>            Reporter: Cheolsoo Park
>            Assignee: Cheolsoo Park
>             Fix For: 0.11
>
>
> To reproduce, please run the following on a machine where no Hadoop is installed:
> {code}
> ant clean
> ant -Dharness.old.pig=old_pig -Dharness.cluster.conf=hadoop_conf_dir -Dharness.cluster.bin=hadoop_script test-e2e-deploy-local -Dhadoopversion=23
> ant -Dharness.old.pig=old_pig -Dharness.cluster.conf=hadoop_conf_dir -Dharness.cluster.bin=hadoop_script test-e2e-local -Dhadoopversion=23
> {code}
> The ant test-e2e-local target fails with the following error:
> {code}
> java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
>         at org.apache.hadoop.security.authentication.util.KerberosName.<clinit>(KerberosName.java:42)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:211)
>         at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:274)
>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:531)
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:512)
> {code}
> In fact, this is also an issue when running Pig in local mode with the fat jar, where no Hadoop dependencies are available in the classpath. For example, the following commands also fail with the same error:
> {code}
> ant clean jar -Dhadoopversion=23
> ./bin/pig -x local
> {code}

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira