Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/04/22 11:17:00 UTC
[jira] [Reopened] (SPARK-27512) Decimal parsing leads to unexpected type inference
[ https://issues.apache.org/jira/browse/SPARK-27512?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon reopened SPARK-27512:
----------------------------------
> Decimal parsing leads to unexpected type inference
> --------------------------------------------------
>
> Key: SPARK-27512
> URL: https://issues.apache.org/jira/browse/SPARK-27512
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Environment: spark 3.0.0-SNAPSHOT from this commit:
> {code:bash}
> commit 3ab96d7acf870e53c9016b0b63d0b328eec23bed
> Author: Dilip Biswal <db...@us.ibm.com>
> Date: Mon Apr 15 21:26:45 2019 +0800
> {code}
> Reporter: koert kuipers
> Priority: Minor
>
> {code:bash}
> $ hadoop fs -text test.bsv
> x|y
> 1|1,2
> 2|2,3
> 3|3,4
> {code}
> In Spark 2.4.1:
> {code:scala}
> scala> val data = spark.read.format("csv").option("header", true).option("delimiter", "|").option("inferSchema", true).load("test.bsv")
> scala> data.printSchema
> root
> |-- x: integer (nullable = true)
> |-- y: string (nullable = true)
> scala> data.show
> +---+---+
> | x| y|
> +---+---+
> | 1|1,2|
> | 2|2,3|
> | 3|3,4|
> +---+---+
> {code}
> In Spark 3.0.0-SNAPSHOT:
> {code:scala}
> scala> val data = spark.read.format("csv").option("header", true).option("delimiter", "|").option("inferSchema", true).load("test.bsv")
> scala> data.printSchema
> root
> |-- x: integer (nullable = true)
> |-- y: decimal(2,0) (nullable = true)
> scala> data.show
> +---+---+
> | x| y|
> +---+---+
> | 1| 12|
> | 2| 23|
> | 3| 34|
> +---+---+
> {code}
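The changed inference is likely a side effect of locale-aware number parsing in the CSV type inferrer: under a US-style locale, ',' is the grouping separator, and the JVM's lenient number parsing accepts it between any digits, so "1,2" reads as the number 12 and the column infers as decimal. A minimal JVM sketch (plain java.text.NumberFormat, not Spark's actual code path) reproduces the effect:

```scala
import java.text.NumberFormat
import java.util.Locale

object GroupingDemo {
  def main(args: Array[String]): Unit = {
    // With Locale.US, ',' is the grouping separator. DecimalFormat's
    // lenient parsing accepts it anywhere, so "1,2" parses as 12 --
    // the same value the Spark 3.0.0-SNAPSHOT inferrer produced above.
    val nf = NumberFormat.getInstance(Locale.US)
    println(nf.parse("1,2")) // 12
  }
}
```

As a workaround until inference is fixed, supplying an explicit schema (e.g. `.schema("x INT, y STRING")` on the DataFrameReader) instead of `inferSchema` keeps "1,2" as a string.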
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org