Posted to issues@spark.apache.org by "angerszhu (Jira)" <ji...@apache.org> on 2019/10/08 06:18:00 UTC
[jira] [Commented] (SPARK-29379) SHOW FUNCTIONS don't show '!=', '<>', 'between', 'case'
[ https://issues.apache.org/jira/browse/SPARK-29379?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16946526#comment-16946526 ]
angerszhu commented on SPARK-29379:
-----------------------------------
We don't need to add a new expression class. If we just add the code in ShowFunctionsCommand, we would have to change a lot of UTs about functions:
{code:java}
case class ShowFunctionsCommand(
    db: Option[String],
    pattern: Option[String],
    showUserFunctions: Boolean,
    showSystemFunctions: Boolean) extends RunnableCommand {

  override val output: Seq[Attribute] = {
    val schema = StructType(StructField("function", StringType, nullable = false) :: Nil)
    schema.toAttributes
  }

  override def run(sparkSession: SparkSession): Seq[Row] = {
    val dbName = db.getOrElse(sparkSession.sessionState.catalog.getCurrentDatabase)
    // If pattern is not specified, we use '*', which is used to
    // match any sequence of characters (including no characters).
    val functionNames =
      sparkSession.sessionState.catalog
        .listFunctions(dbName, pattern.getOrElse("*"))
        .collect {
          case (f, "USER") if showUserFunctions => f.unquotedString
          case (f, "SYSTEM") if showSystemFunctions => f.unquotedString
        }
    (functionNames ++ Seq("!=", "<>", "between", "case")).sorted.map(Row(_))
  }
}
{code}
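One caveat with appending the four operators after the catalog lookup: they would bypass the LIKE pattern that listFunctions already applied, so e.g. SHOW FUNCTIONS LIKE 'bet*' would still print all four. A minimal standalone sketch (not the actual Spark helper; filterPattern here is a hypothetical stand-in that mimics the catalog's '*'/'|' matching) of filtering the hard-coded names through the same pattern first:

{code:java}
// Sketch only: mimic SHOW FUNCTIONS pattern matching, where '*' matches
// any character sequence, '|' separates alternatives, and matching is
// case-insensitive, so the hard-coded operators also respect the pattern.
object ShowFunctionsPatternSketch {
  def filterPattern(names: Seq[String], pattern: String): Seq[String] =
    names.filter { name =>
      pattern.trim.split("\\|").exists { p =>
        // Translate the SQL-style wildcard into a regex and match it.
        name.matches("(?i)" + p.replaceAll("\\*", ".*"))
      }
    }

  def main(args: Array[String]): Unit = {
    val builtIns = Seq("!=", "<>", "between", "case")
    // Only names matching the pattern survive, instead of all four.
    println(filterPattern(builtIns, "bet*"))
    println(filterPattern(builtIns, "*"))
  }
}
{code}

With that in place, the command could append filterPattern(Seq("!=", "<>", "between", "case"), pattern.getOrElse("*")) instead of the unconditional Seq, keeping the output consistent with the catalog's own filtering.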
> SHOW FUNCTIONS don't show '!=', '<>' , 'between', 'case'
> --------------------------------------------------------
>
> Key: SPARK-29379
> URL: https://issues.apache.org/jira/browse/SPARK-29379
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.0, 3.0.0
> Reporter: angerszhu
> Priority: Major
>
> SHOW FUNCTIONS don't show '!=', '<>' , 'between', 'case'
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org