Posted to issues@spark.apache.org by "M. Manna (Jira)" <ji...@apache.org> on 2022/08/30 23:30:00 UTC
[jira] [Updated] (SPARK-40282) DataType argument in StructType.add is incorrectly throwing scala.MatchError
[ https://issues.apache.org/jira/browse/SPARK-40282?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
M. Manna updated SPARK-40282:
-----------------------------
Summary: DataType argument in StructType.add is incorrectly throwing scala.MatchError (was: IntegerType is missed in "ExternalDataTypeForInput" function)
> DataType argument in StructType.add is incorrectly throwing scala.MatchError
> ----------------------------------------------------------------------------
>
> Key: SPARK-40282
> URL: https://issues.apache.org/jira/browse/SPARK-40282
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.3.0
> Reporter: M. Manna
> Priority: Blocker
> Attachments: SparkApplication.kt, retailstore.csv
>
>
> *Problem Description*
> As part of the contract mentioned here, Spark should be able to accept {{IntegerType}} as an argument to the {{StructType.add}} method. However, it currently fails with {{scala.MatchError}}.
>
> If we call the overloaded version that accepts a String value as the type, e.g. "Integer", it works.
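The contrast above can be sketched as follows. This is a hypothetical minimal example, not the attached SparkApplication.kt; it assumes a Spark SQL dependency on the classpath. One likely-relevant detail: from Kotlin or Java, the Scala case object {{IntegerType}} is not referenced the same way as in Scala, and Spark exposes Java-friendly singletons on {{DataTypes}} instead.

```kotlin
// Hedged sketch, assuming org.apache.spark:spark-sql is on the classpath.
import org.apache.spark.sql.types.DataTypes
import org.apache.spark.sql.types.StructType

fun main() {
    // Works: the String overload parses the type name internally.
    val byName = StructType().add("age", "integer")

    // Also works from Kotlin/Java: DataTypes.IntegerType is the
    // Java-facing singleton backing the Scala IntegerType case object.
    val byType = StructType().add("age", DataTypes.IntegerType)

    // Both schemas should end up with a single integer field.
    println(byName.fields().size)
    println(byType.fields().size)
}
```

If the MatchError is triggered only when the {{DataType}} reference is obtained some other way from Kotlin, comparing against {{DataTypes.IntegerType}} may help narrow down whether this is an API-usage issue or a genuine bug.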
> *How to Reproduce*
> # Create a Kotlin project - I have used Kotlin, but Java will also work (with minor adjustments)
> # Place the attached CSV file in {{src/main/resources}}
> # Compile the project with Java 11
> # Run it - it will raise a {{scala.MatchError}}
> # Now change the line commented as HERE to use a String value, i.e. "Integer"
> # It works
> *Ask*
> # Why does the {{add}} function in {{StructType}} not accept {{IntegerType}} or {{StringType}} as the {{DataType}} parameter?
> # If this is a bug, when can we expect a fix?
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org