Posted to issues@spark.apache.org by "L. C. Hsieh (Jira)" <ji...@apache.org> on 2020/09/06 01:09:00 UTC
[jira] [Updated] (SPARK-32780) Fill since fields for all the expressions
[ https://issues.apache.org/jira/browse/SPARK-32780?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
L. C. Hsieh updated SPARK-32780:
--------------------------------
Labels: beginner (was: )
> Fill since fields for all the expressions
> -----------------------------------------
>
> Key: SPARK-32780
> URL: https://issues.apache.org/jira/browse/SPARK-32780
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.1.0
> Reporter: Takeshi Yamamuro
> Priority: Major
> Labels: beginner
>
> Some since fields in ExpressionDescription are currently missing; it is worth filling them in to improve the generated documentation:
> {code:java}
> test("Since has a valid value") {
>   val badExpressions = spark.sessionState.functionRegistry.listFunction()
>     .map(spark.sessionState.catalog.lookupFunctionInfo)
>     .filter(funcInfo => !funcInfo.getSince.matches("[0-9]+\\.[0-9]+\\.[0-9]+"))
>     .map(_.getClassName)
>     .distinct
>     .sorted
>   if (badExpressions.nonEmpty) {
>     fail(s"${badExpressions.length} expressions with invalid 'since':\n"
>       + badExpressions.mkString("\n"))
>   }
> }
> [info] - Since has a valid value *** FAILED *** (16 milliseconds)
> [info] 67 expressions with invalid 'since':
> [info] org.apache.spark.sql.catalyst.expressions.Abs
> [info] org.apache.spark.sql.catalyst.expressions.Add
> [info] org.apache.spark.sql.catalyst.expressions.And
> [info] org.apache.spark.sql.catalyst.expressions.ArrayContains
> [info] org.apache.spark.sql.catalyst.expressions.AssertTrue
> [info] org.apache.spark.sql.catalyst.expressions.BitwiseAnd
> [info] org.apache.spark.sql.catalyst.expressions.BitwiseNot
> [info] org.apache.spark.sql.catalyst.expressions.BitwiseOr
> [info] org.apache.spark.sql.catalyst.expressions.BitwiseXor
> [info] org.apache.spark.sql.catalyst.expressions.CallMethodViaReflection
> [info] org.apache.spark.sql.catalyst.expressions.CaseWhen
> [info] org.apache.spark.sql.catalyst.expressions.Cast
> [info] org.apache.spark.sql.catalyst.expressions.Concat
> [info] org.apache.spark.sql.catalyst.expressions.Crc32
> [info] org.apache.spark.sql.catalyst.expressions.CreateArray
> [info] org.apache.spark.sql.catalyst.expressions.CreateMap
> [info] org.apache.spark.sql.catalyst.expressions.CreateNamedStruct
> [info] org.apache.spark.sql.catalyst.expressions.CurrentDatabase
> [info] org.apache.spark.sql.catalyst.expressions.Divide
> [info] org.apache.spark.sql.catalyst.expressions.EqualNullSafe
> [info] org.apache.spark.sql.catalyst.expressions.EqualTo
> [info] org.apache.spark.sql.catalyst.expressions.Explode
> [info] org.apache.spark.sql.catalyst.expressions.GetJsonObject
> [info] org.apache.spark.sql.catalyst.expressions.GreaterThan
> [info] org.apache.spark.sql.catalyst.expressions.GreaterThanOrEqual
> [info] org.apache.spark.sql.catalyst.expressions.Greatest
> [info] org.apache.spark.sql.catalyst.expressions.If
> [info] org.apache.spark.sql.catalyst.expressions.In
> [info] org.apache.spark.sql.catalyst.expressions.Inline
> [info] org.apache.spark.sql.catalyst.expressions.InputFileBlockLength
> [info] org.apache.spark.sql.catalyst.expressions.InputFileBlockStart
> [info] org.apache.spark.sql.catalyst.expressions.InputFileName
> [info] org.apache.spark.sql.catalyst.expressions.JsonTuple
> [info] org.apache.spark.sql.catalyst.expressions.Least
> [info] org.apache.spark.sql.catalyst.expressions.LessThan
> [info] org.apache.spark.sql.catalyst.expressions.LessThanOrEqual
> [info] org.apache.spark.sql.catalyst.expressions.MapKeys
> [info] org.apache.spark.sql.catalyst.expressions.MapValues
> [info] org.apache.spark.sql.catalyst.expressions.Md5
> [info] org.apache.spark.sql.catalyst.expressions.MonotonicallyIncreasingID
> [info] org.apache.spark.sql.catalyst.expressions.Multiply
> [info] org.apache.spark.sql.catalyst.expressions.Murmur3Hash
> [info] org.apache.spark.sql.catalyst.expressions.Not
> [info] org.apache.spark.sql.catalyst.expressions.Or
> [info] org.apache.spark.sql.catalyst.expressions.Overlay
> [info] org.apache.spark.sql.catalyst.expressions.Pmod
> [info] org.apache.spark.sql.catalyst.expressions.PosExplode
> [info] org.apache.spark.sql.catalyst.expressions.Remainder
> [info] org.apache.spark.sql.catalyst.expressions.Sha1
> [info] org.apache.spark.sql.catalyst.expressions.Sha2
> [info] org.apache.spark.sql.catalyst.expressions.Size
> [info] org.apache.spark.sql.catalyst.expressions.SortArray
> [info] org.apache.spark.sql.catalyst.expressions.SparkPartitionID
> [info] org.apache.spark.sql.catalyst.expressions.Stack
> [info] org.apache.spark.sql.catalyst.expressions.Subtract
> [info] org.apache.spark.sql.catalyst.expressions.TimeWindow
> [info] org.apache.spark.sql.catalyst.expressions.UnaryMinus
> [info] org.apache.spark.sql.catalyst.expressions.UnaryPositive
> [info] org.apache.spark.sql.catalyst.expressions.Uuid
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathBoolean
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathDouble
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathFloat
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathInt
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathList
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathLong
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathShort
> [info] org.apache.spark.sql.catalyst.expressions.xml.XPathString (ExpressionInfoSuite.scala:204)
> {code}
> This was checked by tanelk: https://github.com/apache/spark/pull/29577#discussion_r479794502
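> The fix for each listed expression is to set the since field on its ExpressionDescription annotation. A minimal sketch of what such a change looks like (the "1.2.0" value below is illustrative only; the real value must be looked up from the release in which the expression first appeared):
> {code:java}
> // Hypothetical example: version string must be verified against the
> // release history before use.
> @ExpressionDescription(
>   usage = "_FUNC_(expr) - Returns the absolute value of the numeric value.",
>   since = "1.2.0")  // must match the regex [0-9]+\.[0-9]+\.[0-9]+ checked above
> case class Abs(child: Expression) extends UnaryExpression {
>   // ... existing implementation unchanged ...
> }
> {code}
> Note that the test uses String.matches, which requires the whole string to match the pattern, so an empty or unset since field fails the check.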
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org