Posted to issues@spark.apache.org by "Neelesh Srinivas Salian (JIRA)" <ji...@apache.org> on 2015/12/11 22:40:46 UTC

[jira] [Commented] (SPARK-12047) Unhelpful error messages generated by JavaDoc while doing sbt unidoc

    [ https://issues.apache.org/jira/browse/SPARK-12047?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15053614#comment-15053614 ] 

Neelesh Srinivas Salian commented on SPARK-12047:
-------------------------------------------------

Closing these since they are duplicated by the above-mentioned JIRAs.
Thank you.

> Unhelpful error messages generated by JavaDoc while doing sbt unidoc
> --------------------------------------------------------------------
>
>                 Key: SPARK-12047
>                 URL: https://issues.apache.org/jira/browse/SPARK-12047
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 1.6.0
>            Reporter: Cheng Lian
>
> I'm not quite familiar with the internal mechanism of the SBT Unidoc plugin, but it seems that it tries to convert Scala files into Java files and then run {{javadoc}} over the generated files to produce JavaDoc pages.
> During this process, {{javadoc}} keeps producing unhelpful error messages like:
> {noformat}
> [error] /Users/lian/local/src/spark/branch-1.6/mllib/target/java/org/apache/spark/ml/PredictionModel.java:16: error: unknown tag: group
> [error]   /** @group setParam */
> [error]       ^
> [error] /Users/lian/local/src/spark/branch-1.6/graphx/target/java/org/apache/spark/graphx/lib/PageRank.java:83: error: unknown tag: tparam
> [error]    * @tparam ED the original edge attribute (not used)
> [error]      ^
> [error] /Users/lian/local/src/spark/branch-1.6/core/target/java/org/apache/spark/ContextCleaner.java:76: error: BlockManagerMaster is not public in org.apache.spark.storage; cannot be accessed from outside package
> [error]   private  org.apache.spark.storage.BlockManagerMaster blockManagerMaster () { throw new RuntimeException(); }
> [error]                                    ^
> [error] /Users/lian/local/src/spark/branch-1.6/mllib/target/java/org/apache/spark/mllib/linalg/distributed/BlockMatrix.java:72: error: reference not found
> [error]    * if it is being added to a {@link DenseMatrix}. If two dense matrices are added, the output will
> [error]                                       ^
> {noformat}
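> The {{@group}} and {{@tparam}} tags flagged above are Scaladoc-specific, so the {{javadoc}} run over the generated Java files rejects them as unknown. As a rough sketch of one possible way to silence them (the sbt keys and flags below are assumptions for illustration, not necessarily what Spark's build should do), the custom tags could be disabled in {{javadoc}} through the unidoc settings:
> {noformat}
> // Hypothetical addition to the sbt build (e.g. project/SparkBuild.scala),
> // assuming the sbt-unidoc plugin's JavaUnidoc configuration is in scope.
> javacOptions in (JavaUnidoc, unidoc) ++= Seq(
>   "-tag", "group:X",    // "X" tells javadoc to ignore the custom @group tag
>   "-tag", "tparam:X",   // likewise for Scaladoc's @tparam tag
>   "-Xdoclint:none"      // JDK 8+: relax doclint so doc-comment issues do not fail the build
> )
> {noformat}
> (Whether relaxing doclint is acceptable for the published API docs is a separate question; the sketch above only targets the unknown-tag errors.)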
> The {{scaladoc}} tool also produces tons of warning messages like:
> {noformat}
> [warn] /Users/lian/local/src/spark/branch-1.6/sql/core/src/main/scala/org/apache/spark/sql/Column.scala:1117: Could not find any member to link for "StructField".
> [warn]   /**
> [warn]   ^
> {noformat}
> (This one is probably because of [SI-3695|https://issues.scala-lang.org/browse/SI-3695] and [SI-8734|https://issues.scala-lang.org/browse/SI-8734]).
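> For reference, such "could not find any member to link" warnings are usually worked around in Scaladoc by fully qualifying the link target or by dropping the link syntax in favour of plain monospace. The snippet below is purely illustrative and is not the actual comment from Column.scala:
> {noformat}
> /**
>  * A bare [[StructField]] link may not resolve from this scope, while a fully
>  * qualified path usually does: [[org.apache.spark.sql.types.StructField]].
>  * Plain monospace avoids link resolution entirely: `StructField`.
>  */
> class LinkWorkaroundExample
> {noformat}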
> The problem is that these messages drown out the real problems and make API doc auditing difficult.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
