Posted to mapreduce-user@hadoop.apache.org by Kumar Jayapal <kj...@gmail.com> on 2015/12/30 01:08:54 UTC

Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Hi,

When I run this simple Pig script from the Pig editor in Hue, I get a
permission-denied error, yet I can execute queries in Hive as the same
user. Any idea why?

We are using Sentry for authorisation.


Here is my Pig script.


LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();

STORE LOAD_TBL_A INTO '/tmp/pig_testing001/';
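
Since Hive queries presumably go through HiveServer2 as the hive user
while HCatLoader reads the table files from HDFS directly as my user, a
rough way to check what Pig will try to read (assuming the default
warehouse layout; sandbox.db/suppliers is my guess at the table
directory):

# does my user/group have traverse (x) on the warehouse itself?
$ hdfs dfs -ls -d /user/hive/warehouse

# hypothetical table path under the default warehouse layout
$ hdfs dfs -ls /user/hive/warehouse/sandbox.db/suppliers

The full log from the failed run is below.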




Apache Pig version 0.12.0-cdh5.4.5 (rexported)
compiled Aug 12 2015, 14:17:24

Run pig script using PigRunner.run() for Pig version 0.8+
2015-12-30 00:00:42,435 [uber-SubtaskRunner] INFO  org.apache.pig.Main
 - Apache Pig version 0.12.0-cdh5.4.5 (rexported) compiled Aug 12
2015, 14:17:24
2015-12-30 00:00:42,437 [uber-SubtaskRunner] INFO  org.apache.pig.Main
 - Logging error messages to:
/mnt/drive08-sdj/nm/usercache/edhadmsvc/appcache/application_1449847448721_0473/container_e64_1449847448721_0473_01_000001/pig-job_1449847448721_0473.log
2015-12-30 00:00:42,487 [uber-SubtaskRunner] INFO
org.apache.pig.impl.util.Utils  - Default bootup file
/home/edhadmsvc/.pigbootup
not found
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker
is deprecated. Instead, use mapreduce.jobtracker.address
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is
deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  -
Connecting to hadoop file system at: hdfs://nameservice1
2015-12-30 00:00:42,623 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  -
Connecting to map-reduce job tracker at: yarnRM
2015-12-30 00:00:42,627 [uber-SubtaskRunner] WARN
org.apache.pig.PigServer  - Empty string specified for jar path
2015-12-30 00:00:43,320 [uber-SubtaskRunner] INFO  hive.metastore  -
Trying to connect to metastore with URI
thrift://hmscdh01094p001.corp.costco.com:9083
2015-12-30 00:00:43,387 [uber-SubtaskRunner] INFO  hive.metastore  -
Opened a connection to metastore, current connections: 1
2015-12-30 00:00:43,388 [uber-SubtaskRunner] INFO  hive.metastore  -
Connected to metastore.
2015-12-30 00:00:43,658 [uber-SubtaskRunner] INFO
org.apache.pig.tools.pigstats.ScriptState  - Pig features used in the
script: UNKNOWN
2015-12-30 00:00:43,750 [uber-SubtaskRunner] INFO
org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer  -
{RULES_ENABLED=[AddForEach, ColumnMapKeyPrune,
DuplicateForEachColumnRewrite, GroupByConstParallelSetter,
ImplicitSplitInserter, LimitOptimizer, LoadTypeCastInserter,
MergeFilter, MergeForEach, NewPartitionFilterOptimizer,
PushDownForEachFlatten, PushUpFilter, SplitFilter,
StreamTypeCastInserter],
RULES_DISABLED=[FilterLogicExpressionSimplifier,
PartitionFilterOptimizer]}
2015-12-30 00:00:43,769 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is
deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:43,772 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  -
mapred.textoutputformat.separator is deprecated. Instead, use
mapreduce.output.textoutputformat.separator
2015-12-30 00:00:43,856 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler
 - File concatenation threshold: 100 optimistic? false
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer
 - MR plan size before optimization: 1
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer
 - MR plan size after optimization: 1
2015-12-30 00:00:44,006 [uber-SubtaskRunner] INFO
org.apache.pig.tools.pigstats.ScriptState  - Pig script settings are
added to the job
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  -
mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use
mapreduce.reduce.markreset.buffer.percent
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
 - mapred.job.reduce.markreset.buffer.percent is not set, set to
default 0.3
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  -
mapred.output.compress is deprecated. Instead, use
mapreduce.output.fileoutputformat.compress
2015-12-30 00:00:44,266 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is
deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - mapred.task.id is
deprecated. Instead, use mapreduce.task.attempt.id
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
 - creating jar file Job4443028594885224634.jar
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
 - jar file Job4443028594885224634.jar created
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  - mapred.jar is
deprecated. Instead, use mapreduce.job.jar
2015-12-30 00:00:47,550 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
 - Setting up single store job
2015-12-30 00:00:47,617 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - 1 map-reduce job(s) waiting for submission.
2015-12-30 00:00:47,618 [uber-SubtaskRunner] INFO
org.apache.hadoop.conf.Configuration.deprecation  -
mapred.job.tracker.http.address is deprecated. Instead, use
mapreduce.jobtracker.http.address
2015-12-30 00:00:47,667 [JobControl] INFO
org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  -
Failing over to rm1199
2015-12-30 00:00:47,929 [communication thread] INFO
org.apache.hadoop.mapred.TaskAttemptListenerImpl  - Progress of
TaskAttempt attempt_1449847448721_0473_m_000000_0 is : 1.0
2015-12-30 00:00:48,076 [JobControl] INFO
org.apache.hadoop.conf.Configuration.deprecation  - mapred.input.dir
is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2015-12-30 00:00:48,103 [JobControl] INFO
org.apache.hadoop.mapreduce.JobSubmitter  - Cleaning up the staging
area /user/edhadmsvc/.staging/job_1449847448721_0474
2015-12-30 00:00:48,112 [JobControl] WARN
org.apache.hadoop.security.UserGroupInformation  -
PriviledgedActionException as:edhadmsvc (auth:SIMPLE)
cause:org.apache.pig.backend.executionengine.ExecException: ERROR
2118: Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

2015-12-30 00:00:48,113 [JobControl] INFO
org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob  -
PigLatin:script.pig got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118:
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
	at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
	at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
	at java.lang.Thread.run(Thread.java:745)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException:
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
	at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
	at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
	at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
	at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
	at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
	... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at org.apache.hadoop.ipc.Client.call(Client.java:1468)
	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
	at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
	... 30 more
2015-12-30 00:00:48,119 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - HadoopJobId: job_1449847448721_0474
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - Processing aliases LOAD_TBL_A
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - detailed locations: M: LOAD_TBL_A[1,13] C:  R:
2015-12-30 00:00:48,123 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - 0% complete
2015-12-30 00:00:53,133 [uber-SubtaskRunner] WARN
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - Ooops! Some job has failed! Specify -stop_on_failure if you want
Pig to stop immediately on failure.
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - job job_1449847448721_0474 has failed! Stop running all dependent jobs
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - 100% complete
2015-12-30 00:00:53,202 [uber-SubtaskRunner] INFO
org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  -
Failing over to rm1199
2015-12-30 00:00:53,207 [uber-SubtaskRunner] INFO
org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job
info from RM for job job_1449847448721_0474. Redirecting to job
history server.
2015-12-30 00:00:53,245 [uber-SubtaskRunner] INFO
org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job
info from RM for job job_1449847448721_0474. Redirecting to job
history server.
2015-12-30 00:00:53,260 [uber-SubtaskRunner] ERROR
org.apache.pig.tools.pigstats.PigStatsUtil  - 1 map reduce job(s)
failed!
2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO
org.apache.pig.tools.pigstats.SimplePigStats  - Script Statistics:

HadoopVersion	PigVersion	UserId	StartedAt	FinishedAt	Features
2.6.0-cdh5.4.5	0.12.0-cdh5.4.5	edhadmsvc	2015-12-30 00:00:44	2015-12-30 00:00:53	UNKNOWN

Failed!

Failed Jobs:
JobId	Alias	Feature	Message	Outputs
job_1449847448721_0474	LOAD_TBL_A	MAP_ONLY	Message:
org.apache.pig.backend.executionengine.ExecException: ERROR 2118:
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
	at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
	at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
	at java.lang.Thread.run(Thread.java:745)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException:
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
	at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
	at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
	at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
	at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
	at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
	at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
	at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
	at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
	... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException):
Permission denied: user=edhadmsvc, access=EXECUTE,
inode="/user/hive/warehouse":hive:hive:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
	at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
	at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

	at org.apache.hadoop.ipc.Client.call(Client.java:1468)
	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
	at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
	... 30 more
	/tmp/pig_testing001,

Input(s):
Failed to read data from "sandbox.suppliers"

Output(s):
Failed to produce result in "/tmp/pig_testing001"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_1449847448721_0474


2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
 - Failed!
2015-12-30 00:00:53,269 [uber-SubtaskRunner] ERROR
org.apache.pig.tools.grunt.GruntParser  - ERROR 2244: Job failed,
hadoop does not return any error message
Hadoop Job IDs executed by Pig: job_1449847448721_0474





Thanks
Jay

RE: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Posted by Chinnappan Chandrasekaran <ch...@jos.com.sg>.
Try this (the warehouse lives in HDFS, so the chown has to go through
hdfs dfs, typically run as the HDFS superuser):

$ sudo -u hdfs hdfs dfs -chown -R username:groupname /user/hive/warehouse
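
With Sentry the warehouse is normally expected to stay owned by
hive:hive, so instead of chown-ing it you may prefer to grant the job
user access another way. A sketch, assuming the user is edhadmsvc and
HDFS ACLs are enabled (dfs.namenode.acls.enabled=true):

# give edhadmsvc read+traverse on the existing warehouse tree
$ sudo -u hdfs hdfs dfs -setfacl -R -m user:edhadmsvc:r-x /user/hive/warehouse

# and a matching default ACL so new table directories inherit it
$ sudo -u hdfs hdfs dfs -setfacl -R -m default:user:edhadmsvc:r-x /user/hive/warehouse

Or, more coarsely, add the user to the hive group on the NameNode
host(s), since drwxrwx--- already gives the group full access:

$ usermod -aG hive edhadmsvc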



Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg
55, Ubi Avenue 1 #03-15,Singapore 408935


         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
         at java.lang.Thread.run(Thread.java:745)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
         at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
         at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
         at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
         at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
         ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.hadoop.ipc.Client.call(Client.java:1468)
         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
         at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
         at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
         at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
         ... 30 more
         /tmp/pig_testing001,

Input(s):
Failed to read data from "sandbox.suppliers"

Output(s):
Failed to produce result in "/tmp/pig_testing001<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/tmp/pig_testing001>"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_1449847448721_0474<https://huecdh01094p001.corp.costco.com:8888/jobbrowser/jobs/job_1449847448721_0474>


2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Failed!
2015-12-30 00:00:53,269 [uber-SubtaskRunner] ERROR org.apache.pig.tools.grunt.GruntParser  - ERROR 2244: Job failed, hadoop does not return any error message
Hadoop Job IDs executed by Pig: job_1449847448721_0474<https://huecdh01094p001.corp.costco.com:8888/jobbrowser/jobs/job_1449847448721_0474>




Thanks
Jay


RE: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Posted by Chinnappan Chandrasekaran <ch...@jos.com.sg>.
Try this (note the warehouse path is in HDFS, so use the hdfs CLI rather than a local chown, and run it as an HDFS superuser):

$ hdfs dfs -chown -R username:groupname /user/hive/warehouse
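That said, on a Sentry-managed cluster the warehouse directory is normally meant to stay owned by hive:hive; Hive queries succeed because HiveServer2 accesses the warehouse as the hive user, while a Pig/HCatLoader job reads the table files from HDFS directly as the submitting user, which the drwxrwx--- permissions in the error block. A minimal sketch that avoids re-owning the warehouse, assuming the edhadmsvc user and hive group from the log above (and, for the ACL option, that dfs.namenode.acls.enabled=true is set on the NameNode):

# Inspect the permissions the NameNode reported in the error
$ hdfs dfs -ls -d /user/hive/warehouse

# Option 1: add the job user to the group that already has rwx
#           (group membership is resolved on the NameNode's host)
$ usermod -a -G hive edhadmsvc

# Option 2: grant read/traverse access via an HDFS ACL instead of chown
$ hdfs dfs -setfacl -R -m user:edhadmsvc:r-x /user/hive/warehouse

Either route leaves the hive:hive ownership intact, which Sentry-managed clusters generally expect.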



Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg<ma...@jos.com.sg>
55, Ubi Avenue 1 #03-15,Singapore 408935


From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: Wednesday, 30 December, 2015 8:09 AM
To: Hue-Users; user@hadoop.apache.org
Subject: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Hi,

When I run this simple pig script from pig editor in hue I get permission denied error. I can execute queries in hive as the same user any idea why?

We are using sentry for authorisation.


Here is my pig script.


LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();

STORE LOAD_TBL_A INTO '/tmp/pig_testing001/';






RE: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Posted by Chinnappan Chandrasekaran <ch...@jos.com.sg>.
Try  this

$ chown username:groupname  /user/hive/warehouse/ -R



Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg<ma...@jos.com.sg>
55, Ubi Avenue 1 #03-15,Singapore 408935

[Description: 150828 - JOS email signature_grey-02]<jos.com>

From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: Wednesday, 30 December, 2015 8:09 AM
To: Hue-Users; user@hadoop.apache.org
Subject: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Hi,

When I run this simple pig script from pig editor in hue I get permission denied error. I can execute queries in hive as the same user any idea why?

We are using sentry for authorisation.


Here is my pig script.


LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();

STORE LOAD_TBL_A INTO '/tmp/pig_testing001/';





Apache Pig version 0.12.0-cdh5.4.5 (rexported)
compiled Aug 12 2015, 14:17:24

Run pig script using PigRunner.run() for Pig version 0.8+
2015-12-30 00:00:42,435 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Apache Pig version 0.12.0-cdh5.4.5 (rexported) compiled Aug 12 2015, 14:17:24
2015-12-30 00:00:42,437 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Logging error messages to: /mnt/drive08-sdj/nm/usercache/edhadmsvc/appcache/application_1449847448721_0473/container_e64_1449847448721_0473_01_000001/pig-job_1449847448721_0473.log
2015-12-30 00:00:42,487 [uber-SubtaskRunner] INFO  org.apache.pig.impl.util.Utils  - Default bootup file /home/edhadmsvc/.pigbootup not found
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to hadoop file system at: hdfs://nameservice1
2015-12-30 00:00:42,623 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to map-reduce job tracker at: yarnRM
2015-12-30 00:00:42,627 [uber-SubtaskRunner] WARN  org.apache.pig.PigServer  - Empty string specified for jar path
2015-12-30 00:00:43,320 [uber-SubtaskRunner] INFO  hive.metastore  - Trying to connect to metastore with URI thrift://hmscdh01094p001.corp.costco.com:9083
2015-12-30 00:00:43,387 [uber-SubtaskRunner] INFO  hive.metastore  - Opened a connection to metastore, current connections: 1
2015-12-30 00:00:43,388 [uber-SubtaskRunner] INFO  hive.metastore  - Connected to metastore.
2015-12-30 00:00:43,658 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig features used in the script: UNKNOWN
2015-12-30 00:00:43,750 [uber-SubtaskRunner] INFO  org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer  - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, DuplicateForEachColumnRewrite, GroupByConstParallelSetter, ImplicitSplitInserter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, NewPartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier, PartitionFilterOptimizer]}
2015-12-30 00:00:43,769 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:43,772 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.textoutputformat.separator is deprecated. Instead, use mapreduce.output.textoutputformat.separator
2015-12-30 00:00:43,856 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler  - File concatenation threshold: 100 optimistic? false
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size before optimization: 1
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size after optimization: 1
2015-12-30 00:00:44,006 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig script settings are added to the job
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2015-12-30 00:00:44,266 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - creating jar file Job4443028594885224634.jar
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - jar file Job4443028594885224634.jar created
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2015-12-30 00:00:47,550 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - Setting up single store job
2015-12-30 00:00:47,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 1 map-reduce job(s) waiting for submission.
2015-12-30 00:00:47,618 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2015-12-30 00:00:47,667 [JobControl] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:47,929 [communication thread] INFO  org.apache.hadoop.mapred.TaskAttemptListenerImpl  - Progress of TaskAttempt attempt_1449847448721_0473_m_000000_0 is : 1.0
2015-12-30 00:00:48,076 [JobControl] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2015-12-30 00:00:48,103 [JobControl] INFO  org.apache.hadoop.mapreduce.JobSubmitter  - Cleaning up the staging area /user/edhadmsvc/.staging/job_1449847448721_0474
2015-12-30 00:00:48,112 [JobControl] WARN  org.apache.hadoop.security.UserGroupInformation  - PriviledgedActionException as:edhadmsvc (auth:SIMPLE) cause:org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

2015-12-30 00:00:48,113 [JobControl] INFO  org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob  - PigLatin:script.pig got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
         at java.lang.Thread.run(Thread.java:745)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
         at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
         at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
         at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
         at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
         ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.hadoop.ipc.Client.call(Client.java:1468)
         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
         at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
         at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
         at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
         ... 30 more
2015-12-30 00:00:48,119 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - HadoopJobId: job_1449847448721_0474
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Processing aliases LOAD_TBL_A
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - detailed locations: M: LOAD_TBL_A[1,13] C:  R:
2015-12-30 00:00:48,123 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 0% complete
2015-12-30 00:00:53,133 [uber-SubtaskRunner] WARN  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - job job_1449847448721_0474 has failed! Stop running all dependent jobs
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 100% complete
2015-12-30 00:00:53,202 [uber-SubtaskRunner] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:53,207 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,245 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,260 [uber-SubtaskRunner] ERROR org.apache.pig.tools.pigstats.PigStatsUtil  - 1 map reduce job(s) failed!
2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.SimplePigStats  - Script Statistics:

HadoopVersion    PigVersion       UserId   StartedAt        FinishedAt        Features
2.6.0-cdh5.4.5   0.12.0-cdh5.4.5  edhadmsvc        2015-12-30 00:00:44       2015-12-30 00:00:53         UNKNOWN

Failed!

Failed Jobs:
JobId    Alias    Feature  Message  Outputs
job_1449847448721_0474    LOAD_TBL_A       MAP_ONLY Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
         at java.lang.Thread.run(Thread.java:745)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
         at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
         at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
         at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
         at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
         ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.hadoop.ipc.Client.call(Client.java:1468)
         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
         at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
         at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
         at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
         ... 30 more
         /tmp/pig_testing001,

Input(s):
Failed to read data from "sandbox.suppliers"

Output(s):
Failed to produce result in "/tmp/pig_testing001"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_1449847448721_0474


2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Failed!
2015-12-30 00:00:53,269 [uber-SubtaskRunner] ERROR org.apache.pig.tools.grunt.GruntParser  - ERROR 2244: Job failed, hadoop does not return any error message
Hadoop Job IDs executed by Pig: job_1449847448721_0474




Thanks
Jay

______________________________________________________________________
This email has been scanned by the Symantec Email Security.cloud service.
For more information please visit http://www.symanteccloud.com
______________________________________________________________________
[http://img.photobucket.com/albums/v232/RiZZo66/WLAchieverAward2014BampW.jpg]
For general enquiries, please contact us at JOS Enquiry Email: enquiry@jos.com.sg<ma...@jos.com.sg> Hotline: (+65) 6551 9611
For JOS Support, please contact us at JOS Services Email: services@jos.com.sg<ma...@jos.com.sg> Hotline: (+65) 6484 2302

A member of the Jardine Matheson Group, Jardine OneSolution is one of Asia’s leading providers of integrated IT services and solutions with offices in Singapore, Malaysia, Hong Kong and China. Find out more about JOS at www.jos.com<http://www.jos.com>

Confidentiality Notice and Disclaimer:
This email (including any attachment to it) is confidential and intended only for the use of the individual or entity named above and may contain information that is privileged. If you are not the intended recipient, you are notified that any dissemination, distribution or copying of this email is strictly prohibited. If you have received this email in error, please notify us immediately by return email or telephone and destroy the original message (including any attachment to it). Thank you.



______________________________________________________________________
This email has been scanned by the Symantec Email Security.cloud service.
Confidentiality Notice and Disclaimer: This email (including any attachment to it) is confidential and intended only for the use of the individual or entity named above and may contain information that is privileged.  If you are not the intended recipient, you are notified that any dissemination, distribution or copying of this email is strictly prohibited.  If you have received this email in error, please notify us immediately by return email or telephone and destroy the original message (including any attachment to it).  Thank you.
______________________________________________________________________

RE: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Posted by Chinnappan Chandrasekaran <ch...@jos.com.sg>.
Try  this

$ chown username:groupname  /user/hive/warehouse/ -R



Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg<ma...@jos.com.sg>
55, Ubi Avenue 1 #03-15,Singapore 408935

[Description: 150828 - JOS email signature_grey-02]<jos.com>

From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: Wednesday, 30 December, 2015 8:09 AM
To: Hue-Users; user@hadoop.apache.org
Subject: Getting ERROR 2118: Permission denied: while running PIG script using HCATALOG

Hi,

When I run this simple pig script from pig editor in hue I get permission denied error. I can execute queries in hive as the same user any idea why?

We are using sentry for authorisation.


Here is my pig script.


LOAD_TBL_A = LOAD 'sandbox.suppliers' USING org.apache.hive.hcatalog.pig.HCatLoader();

STORE LOAD_TBL_A INTO '/tmp/pig_testing001/';





Apache Pig version 0.12.0-cdh5.4.5 (rexported)
compiled Aug 12 2015, 14:17:24

Run pig script using PigRunner.run() for Pig version 0.8+
2015-12-30 00:00:42,435 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Apache Pig version 0.12.0-cdh5.4.5 (rexported) compiled Aug 12 2015, 14:17:24
2015-12-30 00:00:42,437 [uber-SubtaskRunner] INFO  org.apache.pig.Main  - Logging error messages to: /mnt/drive08-sdj/nm/usercache/edhadmsvc/appcache/application_1449847448721_0473/container_e64_1449847448721_0473_01_000001/pig-job_1449847448721_0473.log<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/mnt/drive08-sdj/nm/usercache/edhadmsvc/appcache/application_1449847448721_0473/container_e64_1449847448721_0473_01_000001/pig-job_1449847448721_0473.log>
2015-12-30 00:00:42,487 [uber-SubtaskRunner] INFO  org.apache.pig.impl.util.Utils  - Default bootup file /home/edhadmsvc/.pigbootup<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/home/edhadmsvc/.pigbootup> not found
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name<http://fs.default.name> is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:42,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to hadoop file system at: hdfs://nameservice1<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/>
2015-12-30 00:00:42,623 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine  - Connecting to map-reduce job tracker at: yarnRM
2015-12-30 00:00:42,627 [uber-SubtaskRunner] WARN  org.apache.pig.PigServer  - Empty string specified for jar path
2015-12-30 00:00:43,320 [uber-SubtaskRunner] INFO  hive.metastore  - Trying to connect to metastore with URI thrift://hmscdh01094p001.corp.costco.com:9083<http://hmscdh01094p001.corp.costco.com:9083>
2015-12-30 00:00:43,387 [uber-SubtaskRunner] INFO  hive.metastore  - Opened a connection to metastore, current connections: 1
2015-12-30 00:00:43,388 [uber-SubtaskRunner] INFO  hive.metastore  - Connected to metastore.
2015-12-30 00:00:43,658 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig features used in the script: UNKNOWN
2015-12-30 00:00:43,750 [uber-SubtaskRunner] INFO  org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer  - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, DuplicateForEachColumnRewrite, GroupByConstParallelSetter, ImplicitSplitInserter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, NewPartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter], RULES_DISABLED=[FilterLogicExpressionSimplifier, PartitionFilterOptimizer]}
2015-12-30 00:00:43,769 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name<http://fs.default.name> is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:43,772 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.textoutputformat.separator is deprecated. Instead, use mapreduce.output.textoutputformat.separator
2015-12-30 00:00:43,856 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler  - File concatenation threshold: 100 optimistic? false
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size before optimization: 1
2015-12-30 00:00:43,921 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer  - MR plan size after optimization: 1
2015-12-30 00:00:44,006 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.ScriptState  - Pig script settings are added to the job
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2015-12-30 00:00:44,044 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2015-12-30 00:00:44,266 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - fs.default.name<http://fs.default.name> is deprecated. Instead, use fs.defaultFS
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.task.id<http://mapred.task.id> is deprecated. Instead, use mapreduce.task.attempt.id<http://mapreduce.task.attempt.id>
2015-12-30 00:00:44,267 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - creating jar file Job4443028594885224634.jar
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - jar file Job4443028594885224634.jar created
2015-12-30 00:00:47,524 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2015-12-30 00:00:47,550 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler  - Setting up single store job
2015-12-30 00:00:47,617 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 1 map-reduce job(s) waiting for submission.
2015-12-30 00:00:47,618 [uber-SubtaskRunner] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2015-12-30 00:00:47,667 [JobControl] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:47,929 [communication thread] INFO  org.apache.hadoop.mapred.TaskAttemptListenerImpl  - Progress of TaskAttempt attempt_1449847448721_0473_m_000000_0 is : 1.0
2015-12-30 00:00:48,076 [JobControl] INFO  org.apache.hadoop.conf.Configuration.deprecation  - mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
2015-12-30 00:00:48,103 [JobControl] INFO  org.apache.hadoop.mapreduce.JobSubmitter  - Cleaning up the staging area job_1449847448721_0474<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/edhadmsvc/.staging/%3Ca%20href=>" target="_blank">/user/edhadmsvc/.staging/job_1449847448721_0474<https://huecdh01094p001.corp.costco.com:8888/jobbrowser/jobs/job_1449847448721_0474>
2015-12-30 00:00:48,112 [JobControl] WARN  org.apache.hadoop.security.UserGroupInformation  - PriviledgedActionException as:edhadmsvc (auth:SIMPLE) cause:org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

2015-12-30 00:00:48,113 [JobControl] INFO  org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob  - PigLatin:script.pig got an error while submitting
org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
         at java.lang.Thread.run(Thread.java:745)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
         at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
         at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1984)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1128)
         at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1124)
         at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
         at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
         at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
         at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
         at org.apache.hive.hcatalog.mapreduce.HCatBaseInputFormat.getSplits(HCatBaseInputFormat.java:157)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
         ... 18 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse<https://huecdh01094p001.corp.costco.com:8888/filebrowser/view/user/hive/warehouse>":hive:hive:drwxrwx---
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
         at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:4229)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getFileInfo(AuthorizationProviderProxyClientProtocol.java:526)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:822)
         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)

         at org.apache.hadoop.ipc.Client.call(Client.java:1468)
         at org.apache.hadoop.ipc.Client.call(Client.java:1399)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
         at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
         at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
         at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1982)
         ... 30 more
2015-12-30 00:00:48,119 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - HadoopJobId: job_1449847448721_0474
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Processing aliases LOAD_TBL_A
2015-12-30 00:00:48,120 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - detailed locations: M: LOAD_TBL_A[1,13] C:  R:
2015-12-30 00:00:48,123 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 0% complete
2015-12-30 00:00:53,133 [uber-SubtaskRunner] WARN  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - job job_1449847448721_0474 has failed! Stop running all dependent jobs
2015-12-30 00:00:53,133 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - 100% complete
2015-12-30 00:00:53,202 [uber-SubtaskRunner] INFO  org.apache.hadoop.yarn.client.ConfiguredRMFailoverProxyProvider  - Failing over to rm1199
2015-12-30 00:00:53,207 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,245 [uber-SubtaskRunner] INFO  org.apache.hadoop.mapred.ClientServiceDelegate  - Could not get Job info from RM for job job_1449847448721_0474. Redirecting to job history server.
2015-12-30 00:00:53,260 [uber-SubtaskRunner] ERROR org.apache.pig.tools.pigstats.PigStatsUtil  - 1 map reduce job(s) failed!
2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.tools.pigstats.SimplePigStats  - Script Statistics:

HadoopVersion    PigVersion       UserId   StartedAt        FinishedAt        Features
2.6.0-cdh5.4.5   0.12.0-cdh5.4.5  edhadmsvc        2015-12-30 00:00:44       2015-12-30 00:00:53         UNKNOWN

Failed!

Failed Jobs:
JobId    Alias    Feature  Message  Outputs
job_1449847448721_0474    LOAD_TBL_A       MAP_ONLY Message: org.apache.pig.backend.executionengine.ExecException: ERROR 2118: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         [... NameNode permission-check frames identical to the trace above ...]

         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:288)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:597)
         at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:614)
         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:492)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:415)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
         at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:606)
         at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
         at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
         at java.lang.Thread.run(Thread.java:745)
         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=edhadmsvc, access=EXECUTE, inode="/user/hive/warehouse":hive:hive:drwxrwx---
         [... remaining frames and nested RemoteException identical to the trace above ...]
         /tmp/pig_testing001,

Input(s):
Failed to read data from "sandbox.suppliers"

Output(s):
Failed to produce result in "/tmp/pig_testing001"

Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0

Job DAG:
job_1449847448721_0474


2015-12-30 00:00:53,262 [uber-SubtaskRunner] INFO  org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher  - Failed!
2015-12-30 00:00:53,269 [uber-SubtaskRunner] ERROR org.apache.pig.tools.grunt.GruntParser  - ERROR 2244: Job failed, hadoop does not return any error message
Hadoop Job IDs executed by Pig: job_1449847448721_0474
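
Note: decoding the root cause line above, the Pig job reads the table files as my user (user=edhadmsvc), but the warehouse directory is owned hive:hive with mode drwxrwx---, so any user outside the hive group is denied the EXECUTE (traverse) bit needed just to descend into /user/hive/warehouse. Hive queries presumably still work because, on a Sentry-enabled cluster, HiveServer2 runs them as the hive user, while HCatLoader goes straight to HDFS as the submitting user. A minimal sanity check from a gateway node (a sketch only, reusing the user and path from the trace; the expected outputs are my assumption):

hdfs dfs -ls -d /user/hive/warehouse     # show the directory itself; expect drwxrwx--- ... hive hive
hdfs groups edhadmsvc                    # groups the NameNode resolves for this user; is 'hive' among them?
hdfs dfs -getfacl /user/hive/warehouse   # any ACL entries, e.g. from Sentry HDFS sync?

If edhadmsvc does not resolve to the hive group, adding it to that group (or granting an HDFS ACL on the warehouse path) would be the usual fix.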




Thanks
Jay
