Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/10/27 00:57:00 UTC

[jira] [Updated] (SPARK-28817) Support standard Javadoc packaging to allow automatic javadoc location settings

     [ https://issues.apache.org/jira/browse/SPARK-28817?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen updated SPARK-28817:
---------------------------------
    Environment: Sure, if you can figure out how to enable deploying of javadoc artifacts with the release, that'd do it. I think they're already built, of course; they're just not part of what gets pushed. From a quick look at the build, I don't see anything that would disable those artifacts, and I see them built locally. The copy script also seems to take all .jar files. It's a legitimate enhancement, just not something I immediately see how to fix. (A sketch of the standard plugin configuration follows below.)
     Issue Type: Improvement  (was: Bug)
       Priority: Minor  (was: Major)
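
For reference, the standard Maven mechanism for producing and deploying those artifacts is the maven-javadoc-plugin's jar goal: it attaches a -javadoc.jar under the "javadoc" classifier, and mvn deploy then pushes it alongside the main jar. A minimal sketch, assuming the usual plugin wiring (the version shown is illustrative; Spark's poms may already carry an equivalent execution):

    <!-- Minimal sketch: bind javadoc-jar creation to the build so that each
         module's *-javadoc.jar is attached under the "javadoc" classifier
         and deployed with the other artifacts. The plugin version here is
         illustrative, not necessarily what Spark's build uses. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>3.1.1</version>
      <executions>
        <execution>
          <id>attach-javadocs</id>
          <goals>
            <goal>jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>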

> Support standard Javadoc packaging to allow automatic javadoc location settings
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-28817
>                 URL: https://issues.apache.org/jira/browse/SPARK-28817
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.3
>         Environment: Sure, if you can figure out how to enable deploying of javadoc artifacts with the release, that'd do it. I think they're already built, of course; they're just not part of what gets pushed. From a quick look at the build, I don't see anything that would disable those artifacts, and I see them built locally. The copy script also seems to take all .jar files. It's a legitimate enhancement, just not something I immediately see how to fix.
>            Reporter: Marc Le Bihan
>            Priority: Minor
>
> Currently, Spark's javadoc is only accessible here: [http://spark.apache.org/docs/latest/api/java]
> Maven can't find it through a _mvn dependency:resolve -Dclassifier=javadoc_, so Eclipse, for example, can't set the javadoc location automatically to the right URL.
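> For illustration, the same request can be made explicitly in a consumer pom (a hypothetical fragment; it resolves only once Spark publishes javadoc jars under the _javadoc_ classifier):
>
>     <!-- Hypothetical consumer pom fragment: asks Maven for spark-core's
>          javadoc jar. It resolves only if spark-core_2.11-2.4.3-javadoc.jar
>          was deployed under the "javadoc" classifier, which is exactly what
>          this issue requests. -->
>     <dependency>
>       <groupId>org.apache.spark</groupId>
>       <artifactId>spark-core_2.11</artifactId>
>       <version>2.4.3</version>
>       <classifier>javadoc</classifier>
>     </dependency>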
> Therefore, each developer, in each sub-project of a project that uses Spark, must edit every Spark-related jar (about ten of them) and set the HTTP location of the javadoc manually.
> Today, nearly all libraries published on Maven Central respect the standard of delivering a separate, downloadable javadoc through the _javadoc_ classifier.
> Can Spark respect this standard too?



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org