Posted to issues@spark.apache.org by "pavithra ramachandran (JIRA)" <ji...@apache.org> on 2019/07/11 11:21:00 UTC

[jira] [Commented] (SPARK-28338) spark.read.format("csv") treat empty string as null if csv file don't have quotes in data

    [ https://issues.apache.org/jira/browse/SPARK-28338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16882872#comment-16882872 ] 

pavithra ramachandran commented on SPARK-28338:
-----------------------------------------------

Could you give more information about your exact expectation, with an example? If the expectation is that the empty field should come back as an empty string rather than null, there is a workaround sketch after the quoted description below.

> spark.read.format("csv") treat empty string as null if csv file don't have quotes in data
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-28338
>                 URL: https://issues.apache.org/jira/browse/SPARK-28338
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.3
>            Reporter: Jayadevan M
>            Priority: Major
>
> The csv input file
> +cat sample.csv+ 
>  Name,Lastname,Age
>  abc,,32
>  pqr,xxx,30
>  
> +spark-shell+
> spark.read.format("csv").option("header", "true").load("/media/ub_share/projects/*.csv").head(3)
>  res14: Array[org.apache.spark.sql.Row] = Array([abc,null,32], [pqr,xxx,30])
>  
> scala> spark.read.format("csv").option("header", "true").option("nullValue", "?").load("/media/ub_share/projects/*.csv").head(3)
>  res15: Array[org.apache.spark.sql.Row] = Array([abc,null,32], [pqr,xxx,30])
>  
> The empty string gets converted to null. It works fine if the csv file has quotes around the column values.
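
As a side note on the behaviour described above: if the goal is to get empty strings back instead of nulls, one possible workaround is to fill the nulls after the read. This is a minimal sketch, assuming Spark 2.4.x in spark-shell; the path "sample.csv" is just a placeholder.

// Read the CSV as in the report; unquoted empty fields come back as null.
val df = spark.read
  .format("csv")
  .option("header", "true")
  .load("sample.csv")

// na.fill("") replaces null values in string-typed columns with empty strings.
// Note: this affects every null in every string column, so it is only suitable
// when nulls and empty fields are meant to be treated the same way.
val filled = df.na.fill("")
filled.show()

The CSV reader also has an emptyValue option in 2.4+, which may be worth experimenting with, though I have not verified how it interacts with unquoted empty fields.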


