Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/07 08:57:11 UTC

[jira] [Commented] (SPARK-12166) Unset hadoop related environment in testing

    [ https://issues.apache.org/jira/browse/SPARK-12166?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15044531#comment-15044531 ] 

Apache Spark commented on SPARK-12166:
--------------------------------------

User 'zjffdu' has created a pull request for this issue:
https://github.com/apache/spark/pull/10172

> Unset hadoop related environment in testing 
> --------------------------------------------
>
>                 Key: SPARK-12166
>                 URL: https://issues.apache.org/jira/browse/SPARK-12166
>             Project: Spark
>          Issue Type: Improvement
>          Components: Tests
>    Affects Versions: 1.5.2
>            Reporter: Jeff Zhang
>            Priority: Minor
>
> I tried to run HiveSparkSubmitSuite on my local box, but it fails. The cause is that Spark is still picking up my local single-node Hadoop cluster's configuration during the unit test, which doesn't make sense. These environment variables should be unset before testing (one way to do that is sketched after the error message below), and I suspect dev/run-tests doesn't unset them either.
> Here's the error message:
> {code}
> Cause: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
> [info]   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
> [info]   at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
> [info]   at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
> [info]   at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
> {code}
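> A minimal sketch of that isolation, under stated assumptions: the variable list and the IsolatedTestLauncher helper below are illustrative only, not necessarily what the pull request above does. The idea is to run the test command in a child process and strip the Hadoop-related variables from its environment:
> {code}
> import scala.collection.JavaConverters._
>
> // Sketch: run a command in a child process whose environment has the
> // Hadoop-related variables removed, so the test cannot pick up a locally
> // configured single-node cluster.
> object IsolatedTestLauncher {
>   // Assumed list of variables that can leak local cluster configuration.
>   private val hadoopEnvVars = Seq("HADOOP_CONF_DIR", "HADOOP_HOME", "YARN_CONF_DIR")
>
>   def launch(cmd: Seq[String]): Process = {
>     val builder = new ProcessBuilder(cmd.asJava)
>     // Remove the inherited variables before starting the child JVM.
>     hadoopEnvVars.foreach(name => builder.environment.remove(name))
>     builder.inheritIO() // forward the child's stdout/stderr to this JVM
>     builder.start()
>   }
> }
> {code}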



