Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/11/11 11:50:34 UTC

[jira] [Commented] (SPARK-4326) unidoc is broken on master

    [ https://issues.apache.org/jira/browse/SPARK-4326?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14206265#comment-14206265 ] 

Sean Owen commented on SPARK-4326:
----------------------------------

Hm. {{hashInt}} isn't in Guava 11, but it is in Guava 12. That leads me to believe unidoc is picking up Guava 11 from Hadoop rather than Guava 14 from Spark, since Spark's copy is shaded. I would like to phone a friend: [~vanzin]
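For context on why the {{hashInt}} calls break against Guava 11: {{Hashing.murmur3_32().hashInt(h)}} (available from Guava 12) runs the four bytes of {{h}} through full murmur3_32. A dependency-free sketch of the murmur3 finalizer ("fmix32") that this kind of int rehash relies on, using the standard murmur3 constants, is below. This is only an illustration of the mixing step, not byte-for-byte equivalent to Guava's output and not Spark's actual fix.

```java
// Sketch of the standard 32-bit murmur3 finalizer (fmix32).
// Not equivalent to Guava's Hashing.murmur3_32().hashInt(h), which hashes
// the four bytes of h with the full murmur3 algorithm; this shows only the
// final avalanche/mixing step such a rehash depends on.
public final class Fmix32 {
    static int fmix32(int h) {
        h ^= h >>> 16;
        h *= 0x85ebca6b;   // standard murmur3 mixing constant 1
        h ^= h >>> 13;
        h *= 0xc2b2ae35;   // standard murmur3 mixing constant 2
        h ^= h >>> 16;
        return h;
    }

    public static void main(String[] args) {
        // Zero is a fixed point: every step maps 0 to 0.
        System.out.println(fmix32(0));
        // fmix32 is a bijection on int (xor-shifts and odd-constant
        // multiplications are invertible mod 2^32), so distinct inputs
        // never collide.
        System.out.println(fmix32(1) != fmix32(2));
    }
}
```

Because every operation in fmix32 is invertible, it permutes the 32-bit integers, which is the property that makes it usable as a rehash for open-addressed hash tables like {{AppendOnlyMap}}.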

> unidoc is broken on master
> --------------------------
>
>                 Key: SPARK-4326
>                 URL: https://issues.apache.org/jira/browse/SPARK-4326
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Documentation
>    Affects Versions: 1.3.0
>            Reporter: Xiangrui Meng
>
> On master, `jekyll build` throws the following error:
> {code}
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/AppendOnlyMap.scala:205: value hashInt is not a member of com.google.common.hash.HashFunction
> [error]   private def rehash(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
> [error]                                                          ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalAppendOnlyMap.scala:426: value limit is not a member of object com.google.common.io.ByteStreams
> [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
> [error]                                                                  ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala:558: value limit is not a member of object com.google.common.io.ByteStreams
> [error]         val bufferedStream = new BufferedInputStream(ByteStreams.limit(fileStream, end - start))
> [error]                                                                  ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/OpenHashSet.scala:261: value hashInt is not a member of com.google.common.hash.HashFunction
> [error]   private def hashcode(h: Int): Int = Hashing.murmur3_32().hashInt(h).asInt()
> [error]                                                            ^
> [error] /Users/meng/src/spark/core/src/main/scala/org/apache/spark/util/collection/Utils.scala:37: type mismatch;
> [error]  found   : java.util.Iterator[T]
> [error]  required: Iterable[?]
> [error]     collectionAsScalaIterable(ordering.leastOf(asJavaIterator(input), num)).iterator
> [error]                                                              ^
> [error] /Users/meng/src/spark/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala:421: value putAll is not a member of com.google.common.cache.Cache[org.apache.hadoop.fs.FileStatus,parquet.hadoop.Footer]
> [error]           footerCache.putAll(newFooters)
> [error]                       ^
> [warn] /Users/meng/src/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/parquet/FakeParquetSerDe.scala:34: @deprecated now takes two arguments; see the scaladoc.
> [warn] @deprecated("No code should depend on FakeParquetHiveSerDe as it is only intended as a " +
> [warn]  ^
> [info] No documentation generated with unsucessful compiler run
> [warn] two warnings found
> [error] 6 errors found
> [error] (spark/scalaunidoc:doc) Scaladoc generation failed
> [error] Total time: 48 s, completed Nov 10, 2014 1:31:01 PM
> {code}
> It doesn't happen on branch-1.2.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
