Posted to issues@hive.apache.org by "Mohit Sabharwal (JIRA)" <ji...@apache.org> on 2016/05/03 07:38:12 UTC
[jira] [Updated] (HIVE-13657) Spark driver stderr logs should appear in hive client logs
[ https://issues.apache.org/jira/browse/HIVE-13657?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Mohit Sabharwal updated HIVE-13657:
-----------------------------------
Status: Patch Available (was: Open)
> Spark driver stderr logs should appear in hive client logs
> ----------------------------------------------------------
>
> Key: HIVE-13657
> URL: https://issues.apache.org/jira/browse/HIVE-13657
> Project: Hive
> Issue Type: Bug
> Reporter: Mohit Sabharwal
> Assignee: Mohit Sabharwal
> Attachments: HIVE-13657.patch
>
>
> Currently, spark driver exceptions are not getting logged in beeline. Instead, the user sees the not-so-useful:
> {code}
> ERROR : Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
> <huge stack trace omitted>
> {code}
> The user has to look at HS2 logs to discover the root cause:
> {code}
> 2015-04-01 11:33:16,048 INFO org.apache.hive.spark.client.SparkClientImpl: 15/04/01 11:33:16 WARN UserGroupInformation: PriviledgedActionException as:foo (auth:PROXY) via hive (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=foo, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
> ...
> {code}
> We should surface these critical errors in the hive client.
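The fix described above amounts to forwarding the Spark driver's stderr stream into a log the client can see. As an illustrative sketch only (not the attached HIVE-13657.patch; the class name, the `sink` list standing in for the client-visible log, and the log-line prefix are all assumptions for demonstration), a redirector thread reading the driver's stderr line by line might look like:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Hypothetical redirector: reads each line of the driver's stderr and
// forwards it to a client-visible sink instead of letting it vanish
// into the HS2 process logs.
public class StderrRedirector implements Runnable {
    private final InputStream in;          // the driver process's stderr
    private final List<String> sink;       // stand-in for the client-visible log

    public StderrRedirector(InputStream in, List<String> sink) {
        this.in = in;
        this.sink = sink;
    }

    @Override
    public void run() {
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Prefix so the client can tell driver output from its own logs.
                sink.add("Spark driver stderr: " + line);
            }
        } catch (Exception e) {
            sink.add("stderr redirection failed: " + e.getMessage());
        }
    }

    public static void main(String[] args) throws Exception {
        // Simulated driver stderr carrying the root-cause exception.
        InputStream fakeStderr = new ByteArrayInputStream(
            "AccessControlException: Permission denied: user=foo\n"
                .getBytes(StandardCharsets.UTF_8));
        List<String> clientLog = new ArrayList<>();
        Thread t = new Thread(new StderrRedirector(fakeStderr, clientLog));
        t.start();
        t.join();
        System.out.println(clientLog.get(0));
    }
}
```

In the real fix the sink would be whatever log stream HS2 ships back to beeline, so the AccessControlException in the example above reaches the user directly.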
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)