Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/06/26 01:49:00 UTC

[jira] [Assigned] (SPARK-23776) pyspark-sql tests should display build instructions when components are missing

     [ https://issues.apache.org/jira/browse/SPARK-23776?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-23776:
------------------------------------

    Assignee: Bruce Robbins

> pyspark-sql tests should display build instructions when components are missing
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-23776
>                 URL: https://issues.apache.org/jira/browse/SPARK-23776
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Bruce Robbins
>            Assignee: Bruce Robbins
>            Priority: Minor
>             Fix For: 2.4.0
>
>
> This is a follow up to SPARK-23417.
> The pyspark-streaming tests print useful build instructions when certain components are missing in the build.
> pyspark-sql's udf and readwrite tests also have specific build requirements: the build must compile the test Scala files, and it must also create the Hive assembly. When those class or jar files are absent, the tests throw only partially helpful exceptions, e.g.:
> {noformat}
> AnalysisException: u'Can not load class test.org.apache.spark.sql.JavaStringLength, please make sure it is on the classpath;'
> {noformat}
> or
> {noformat}
> IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':"
> {noformat}
> You end up in this situation when you follow Spark's build instructions and then attempt to run the pyspark tests.
> It would be nice if the pyspark-sql tests provided helpful build instructions in these cases.
>   



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org