Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/01/08 03:27:00 UTC
[jira] [Resolved] (SPARK-26567) Should we align CSV query results with hive text query results: an int field, if the input value is 1.0, hive text query results is 1, CSV query results is null
[ https://issues.apache.org/jira/browse/SPARK-26567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-26567.
----------------------------------
Resolution: Won't Fix
> Should we align CSV query results with hive text query results: an int field, if the input value is 1.0, hive text query results is 1, CSV query results is null
> ----------------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-26567
> URL: https://issues.apache.org/jira/browse/SPARK-26567
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: eaton
> Priority: Minor
>
> If we want to be consistent with Hive text tables, we can modify the makeConverter function in UnivocityParser, but parsing performance may get worse. The modified code is as follows:
>
> {code:java}
> def makeConverter(
>     name: String,
>     dataType: DataType,
>     nullable: Boolean = true,
>     options: CSVOptions): ValueConverter = dataType match {
>   // Parse via Double first so inputs like "1.0" still convert,
>   // then narrow to the target integral type.
>   case _: ByteType => (d: String) =>
>     nullSafeDatum(d, name, nullable, options)(_.toDouble.intValue().toByte)
>   case _: ShortType => (d: String) =>
>     nullSafeDatum(d, name, nullable, options)(_.toDouble.intValue().toShort)
>   case _: IntegerType => (d: String) =>
>     nullSafeDatum(d, name, nullable, options)(_.toDouble.intValue())
>   case _: LongType => (d: String) =>
>     // Use longValue() directly; going through intValue() would truncate
>     // values outside the Int range.
>     nullSafeDatum(d, name, nullable, options)(_.toDouble.longValue())
>   // ... remaining cases as in the original makeConverter
> }
> {code}
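The core of the proposed workaround (parsing the raw token as a double and then narrowing to the integral type, so "1.0" yields 1 instead of null) can be illustrated outside Spark. The following standalone Java sketch mimics that lenient conversion; `parseIntLenient` is a hypothetical helper for illustration only, not part of Spark's UnivocityParser API:

```java
// Sketch of the lenient integer conversion discussed above: parse the raw
// CSV token as a double first, then narrow to int, so "1.0" becomes 1
// instead of failing (which Spark's CSV reader would surface as null).
public class LenientIntParse {
    // Hypothetical helper, not a Spark API. Returns null for unparseable
    // input, mirroring how the CSV parser yields null for malformed values.
    static Integer parseIntLenient(String raw) {
        if (raw == null || raw.isEmpty()) return null;
        try {
            // The strict path would be Integer.parseInt(raw), which
            // rejects "1.0"; parsing as double first accepts it.
            return (int) Double.parseDouble(raw);
        } catch (NumberFormatException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        System.out.println(parseIntLenient("1.0")); // 1, matching Hive text behavior
        System.out.println(parseIntLenient("1"));   // 1
        System.out.println(parseIntLenient("abc")); // null
    }
}
```

As the resolution ("Won't Fix") suggests, the extra double round-trip on every integral cell is a real cost, which is the performance concern raised in the description.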
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org