Posted to issues@spark.apache.org by "Kengo Seki (JIRA)" <ji...@apache.org> on 2019/01/10 16:09:00 UTC
[jira] [Updated] (SPARK-26564) Fix wrong assertions and error messages for parameter checking
[ https://issues.apache.org/jira/browse/SPARK-26564?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kengo Seki updated SPARK-26564:
-------------------------------
Summary: Fix wrong assertions and error messages for parameter checking (was: Fix misleading error message about spark.network.timeout and spark.executor.heartbeatInterval)
> Fix wrong assertions and error messages for parameter checking
> --------------------------------------------------------------
>
> Key: SPARK-26564
> URL: https://issues.apache.org/jira/browse/SPARK-26564
> Project: Spark
> Issue Type: Bug
> Components: MLlib, Spark Core
> Affects Versions: 2.3.0, 2.3.1, 2.3.2, 2.4.0
> Reporter: Kengo Seki
> Priority: Minor
> Labels: starter
>
> I mistakenly set spark.executor.heartbeatInterval to the same value as spark.network.timeout and got the following error:
> {code}
> java.lang.IllegalArgumentException: requirement failed: The value of spark.network.timeout=120s must be no less than the value of spark.executor.heartbeatInterval=120s.
> {code}
> But this message can be read as if the two values were allowed to be equal. "Greater than" would be more precise than "no less than".
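> A stricter check whose message matches its condition might look like this (a hypothetical sketch; the method and parameter names below are illustrative, not Spark's actual code):
> {code}
> // Illustrative only: require the network timeout to be strictly greater
> // than the heartbeat interval, and state exactly that in the message.
> def checkTimeouts(networkTimeoutMs: Long, heartbeatIntervalMs: Long): Unit = {
>   require(networkTimeoutMs > heartbeatIntervalMs,
>     s"The value of spark.network.timeout=${networkTimeoutMs}ms must be greater than " +
>       s"the value of spark.executor.heartbeatInterval=${heartbeatIntervalMs}ms.")
> }
> {code}
> With a strict inequality, equal values now fail the check, so the wording and the behavior agree.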
> ----
> In addition, the conditions in the following assertions are inconsistent with their error messages; in these cases the messages are correct and the conditions should be fixed.
> {code:title=mllib/src/main/scala/org/apache/spark/ml/optim/WeightedLeastSquares.scala}
> require(maxIter >= 0, s"maxIter must be a positive integer: $maxIter")
> {code}
> {code:title=sql/core/src/main/scala/org/apache/spark/sql/execution/joins/HashedRelation.scala}
> require(capacity < 512000000, "Cannot broadcast more than 512 millions rows")
> {code}
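> For the first case, tightening the condition to match its message would look like this (a sketch, assuming the message reflects the intended behavior; this is not necessarily the committed fix):
> {code}
> // Illustrative only: reject maxIter = 0, which the current ">= 0" check
> // accepts even though the message demands a positive integer.
> def checkMaxIter(maxIter: Int): Unit =
>   require(maxIter > 0, s"maxIter must be a positive integer: $maxIter")
> {code}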
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org