Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2019/12/07 22:38:00 UTC
[jira] [Created] (SPARK-30164) Fix document generation in "sbt unidoc"
Gengliang Wang created SPARK-30164:
--------------------------------------
Summary: Fix document generation in "sbt unidoc"
Key: SPARK-30164
URL: https://issues.apache.org/jira/browse/SPARK-30164
Project: Spark
Issue Type: Bug
Components: Build, Documentation
Affects Versions: 3.0.0
Reporter: Gengliang Wang
Assignee: Gengliang Wang
In the latest master branch, the document generation command:
{code:java}
./build/sbt -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pmesos -Pkinesis-asl -Pspark-ganglia-lgpl -Pkubernetes -Phadoop-cloud -Phive-thriftserver unidoc
{code}
fails with the following error messages:
{code:java}
[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:248: error: incompatible types: org.apache.hive.service.rpc.thrift.TSessionHandle cannot be converted to org.apache.hive.service.cli.thrift.TSessionHandle
[error] resp.setSessionHandle(sessionHandle.toTSessionHandle());
[error] ^
[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:259: error: incompatible types: org.apache.hive.service.rpc.thrift.TStatus cannot be converted to org.apache.hive.service.cli.thrift.TStatus
[error] resp.setStatus(HiveSQLException.toTStatus(e));
[error] ^
[error] /Users/gengliang.wang/Downloads/spark/sql/hive-thriftserver/v2.3/src/main/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java:346: error: method getMinVersion in class ThriftCLIService cannot be applied to given types;
[error] TProtocolVersion protocol = getMinVersion(CLIService.SERVER_VERSION,
[error]
{code}
To fix it, we should change "sbt unidoc" to "sbt clean unidoc".
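With the same profile set as the failing command above, the fixed invocation would look like this (a sketch; the profile list should match whatever your build actually enables):
{code:java}
# Running "clean" first removes stale class files from the previous build,
# so unidoc compiles against the correct Thrift-generated classes.
./build/sbt -Phadoop-2.7 -Phive-2.3 -Pyarn -Phive -Pmesos -Pkinesis-asl -Pspark-ganglia-lgpl -Pkubernetes -Phadoop-cloud -Phive-thriftserver clean unidoc
{code}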
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org