Posted to issues@spark.apache.org by "Kent Yao (Jira)" <ji...@apache.org> on 2020/03/26 02:00:10 UTC

[jira] [Updated] (SPARK-31258) sbt unidoc fail to resolve arvo dependency

     [ https://issues.apache.org/jira/browse/SPARK-31258?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kent Yao updated SPARK-31258:
-----------------------------
    Summary: sbt unidoc fail to resolve arvo dependency  (was: sbt unidoc fail to resolving arvo dependency)

> sbt unidoc fail to resolve arvo dependency
> ------------------------------------------
>
>                 Key: SPARK-31258
>                 URL: https://issues.apache.org/jira/browse/SPARK-31258
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 3.1.0
>            Reporter: Kent Yao
>            Priority: Major
>
> {code:java}
> [warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
> [warn] Multiple main classes detected.  Run 'show discoveredMainClasses' to see the list
> [info] Main Scala API documentation to /home/jenkins/workspace/SparkPullRequestBuilder@6/target/scala-2.12/unidoc...
> [info] Main Java API documentation to /home/jenkins/workspace/SparkPullRequestBuilder@6/target/javaunidoc...
> [error] /home/jenkins/workspace/SparkPullRequestBuilder@6/core/src/main/scala/org/apache/spark/serializer/GenericAvroSerializer.scala:123: value createDatumWriter is not a member of org.apache.avro.generic.GenericData
> [error]     writerCache.getOrElseUpdate(schema, GenericData.get.createDatumWriter(schema))
> [error]                                                         ^
> [info] No documentation generated with unsuccessful compiler run
> [error] one error found
> {code}
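
The error above indicates that the unidoc classpath resolved an Avro artifact old enough to lack `GenericData.createDatumWriter`, even though the regular compile classpath uses a newer one. A common way to address this kind of scope-specific resolution drift in sbt is to force a single Avro version across all configurations. The fragment below is an illustrative sketch, not the actual fix applied to Spark's build; the version number is an assumption:

{code:java}
// Hypothetical sbt build fragment: pin Avro so every configuration
// (including the unidoc documentation scope) resolves the same artifact.
// The version shown is illustrative -- it must match one whose
// GenericData class provides createDatumWriter.
dependencyOverrides += "org.apache.avro" % "avro" % "1.8.2"
{code}

Running `show update` (or `evicted`) afterwards confirms which Avro version each configuration actually resolved.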



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
