Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2014/08/20 22:29:23 UTC
[jira] [Resolved] (SPARK-3062) ShutdownHookManager is only available in Hadoop 2.x
[ https://issues.apache.org/jira/browse/SPARK-3062?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-3062.
-------------------------------------
Resolution: Fixed
Fix Version/s: 1.1.0
> ShutdownHookManager is only available in Hadoop 2.x
> ---------------------------------------------------
>
> Key: SPARK-3062
> URL: https://issues.apache.org/jira/browse/SPARK-3062
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.0.2
> Reporter: Cheng Lian
> Priority: Blocker
> Fix For: 1.1.0
>
>
> PR [#1891|https://github.com/apache/spark/pull/1891] leverages {{ShutdownHookManager}} to avoid an {{IOException}} when {{EventLogging}} is enabled. Unfortunately, {{ShutdownHookManager}} is only available in Hadoop 2.x, so compilation fails when building Spark against Hadoop 1:
> {code}
> $ ./sbt/sbt -Phive-thriftserver
> ...
> [ERROR] /home/spark/software/source/compile/spark/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:30: object ShutdownHookManager is not a member of package org.apache.hadoop.util
> [ERROR] import org.apache.hadoop.util.ShutdownHookManager
> [ERROR] ^
> [ERROR] /home/spark/software/source/compile/spark/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:125: not found: value ShutdownHookManager
> [ERROR] ShutdownHookManager.get.addShutdownHook(
> [ERROR] ^
> [WARNING] one warning found
> [ERROR] two errors found
> {code}
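A portable workaround (a sketch, not the exact patch merged for this issue) is to register the cleanup hook through the JVM's own {{Runtime.addShutdownHook}}, which exists on every Hadoop version; the {{PortableShutdownHook}} name and the {{cleanup}} parameter below are illustrative, not from the Spark codebase.

```scala
object PortableShutdownHook {
  // Registers a JVM shutdown hook instead of using the Hadoop-2.x-only
  // org.apache.hadoop.util.ShutdownHookManager. Returns the hook Thread
  // so callers can deregister it later if needed.
  def register(cleanup: () => Unit): Thread = {
    val hook = new Thread(new Runnable {
      override def run(): Unit = cleanup()
    })
    Runtime.getRuntime.addShutdownHook(hook)
    hook
  }
}
```

The trade-off is that {{ShutdownHookManager}} supports hook priorities while plain JVM hooks run in an unspecified order, so any ordering requirements have to be handled inside a single hook.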
--
This message was sent by Atlassian JIRA
(v6.2#6252)