Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/02/05 02:30:00 UTC
[jira] [Commented] (SPARK-34373) HiveThriftServer2 startWithContext may hang with a race issue
[ https://issues.apache.org/jira/browse/SPARK-34373?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17279279#comment-17279279 ]
Apache Spark commented on SPARK-34373:
--------------------------------------
User 'yaooqinn' has created a pull request for this issue:
https://github.com/apache/spark/pull/31479
> HiveThriftServer2 startWithContext may hang with a race issue
> --------------------------------------------------------------
>
> Key: SPARK-34373
> URL: https://issues.apache.org/jira/browse/SPARK-34373
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.1, 3.1.0
> Reporter: Kent Yao
> Priority: Major
>
> ```
> 21:43:26.809 WARN org.apache.thrift.server.TThreadPoolServer: Transport error occurred during acceptance of message.
> org.apache.thrift.transport.TTransportException: No underlying server socket.
> at org.apache.thrift.transport.TServerSocket.acceptImpl(TServerSocket.java:126)
> at org.apache.thrift.transport.TServerSocket.acceptImpl(TServerSocket.java:35)
> at org.apache.thrift.transport.TServerTransport.acce
> Exception in thread "Thread-15" java.io.IOException: Stream closed
> at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
> at java.io.BufferedInputStream.read(BufferedInputStream.java:336)
> at java.io.FilterInputStream.read(FilterInputStream.java:107)
> at scala.sys.process.BasicIO$.loop$1(BasicIO.scala:238)
> at scala.sys.process.BasicIO$.transferFullyImpl(BasicIO.scala:246)
> at scala.sys.process.BasicIO$.transferFully(BasicIO.scala:227)
> at scala.sys.process.BasicIO$.$anonfun$toStdOut$1(BasicIO.scala:221)
> ```
> The TServer might still try to serve even after stop() has been called.
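> A minimal, hypothetical sketch (not the actual Thrift or Spark code; all names here are illustrative) of how such a race can lose a stop(): if the serving thread resets its internal "stopped" flag when it enters serve(), a stop() that ran first has no effect and the server keeps serving.
>
> ```scala
> import java.util.concurrent.atomic.AtomicBoolean
>
> class ToyServer {
>   private val stopped = new AtomicBoolean(false)
>   // Counts serve-loop iterations; nonzero after a prior stop() means the stop was lost.
>   var servedAfterStop = 0
>
>   def serve(maxIters: Int): Unit = {
>     stopped.set(false) // mirrors a server resetting its flag on entry to serve()
>     var i = 0
>     while (!stopped.get && i < maxIters) { servedAfterStop += 1; i += 1 }
>   }
>
>   def stop(): Unit = stopped.set(true)
> }
>
> val s = new ToyServer
> s.stop()      // stop() wins the race and runs first...
> s.serve(5)    // ...but serve() clears the flag and keeps serving
> println(s.servedAfterStop)
> ```
>
> In the bounded sketch the loop exits after maxIters; with an unbounded loop (as in a real accept loop) this would hang, which matches the symptom reported above.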
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org