Posted to issues@spark.apache.org by "Shivaram Venkataraman (JIRA)" <ji...@apache.org> on 2015/07/01 06:56:04 UTC

[jira] [Commented] (SPARK-8742) Improve SparkR error messages for DataFrame API

    [ https://issues.apache.org/jira/browse/SPARK-8742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14609558#comment-14609558 ] 

Shivaram Venkataraman commented on SPARK-8742:
----------------------------------------------

Thanks [~falaki] for creating this. This is a pretty important issue and I think there might be a bunch of things to improve here.

I think the most important thing is to filter out the Netty stack trace that comes from the RBackend handler. Typically the Netty server throws an error when some other Java function call has failed, and the error is rarely in the Netty call itself. One way to do this might be to return a string message that encodes part of the actual exception when the return status is non-zero.
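A minimal sketch of what that could look like in invokeJava (hypothetical; this assumes the JVM side is changed to write the exception message as a string after a non-zero status, and reuses the readInt / readString helpers from the R backend):

{code}
# Read the status code for the call that just ran.
returnStatus <- readInt(conn)
if (returnStatus != 0) {
  # Surface the JVM-side error text instead of failing a generic
  # assertion on the status code.
  stop(readString(conn))
}
{code}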


> Improve SparkR error messages for DataFrame API
> -----------------------------------------------
>
>                 Key: SPARK-8742
>                 URL: https://issues.apache.org/jira/browse/SPARK-8742
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.4.1
>            Reporter: Hossein Falaki
>            Priority: Blocker
>
> Currently all DataFrame API errors result in following generic error:
> {code}
> Error: returnStatus == 0 is not TRUE
> {code}
> This is because invokeJava in backend.R does not inspect error messages. For most use cases it is critical to return better error messages. Initially, we can return the stack trace from the JVM. In the future, we can inspect the errors and translate them into human-readable messages.
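> The message above is just what a bare {{stopifnot}} check on the status code produces. A rough reconstruction (assuming invokeJava currently validates the status along these lines):
> {code}
> returnStatus <- readInt(conn)  # non-zero when the JVM-side call failed
> stopifnot(returnStatus == 0)   # fails with "returnStatus == 0 is not TRUE"
> {code}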



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org