Posted to issues@spark.apache.org by "Brian Jones (JIRA)" <ji...@apache.org> on 2018/10/15 21:30:00 UTC

[jira] [Created] (SPARK-25739) Double quote coming in as empty value even when emptyValue set as null

Brian Jones created SPARK-25739:
-----------------------------------

             Summary: Double quote coming in as empty value even when emptyValue set as null
                 Key: SPARK-25739
                 URL: https://issues.apache.org/jira/browse/SPARK-25739
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.2
         Environment:  Example code - 
{code:java}
val df = List((1,""),(2,"hello"),(3,"hi"),(4,null)).toDF("key","value")
df
.repartition(1)
.write
.mode("overwrite")
.option("nullValue", null)
.option("emptyValue", null)
.option("delimiter",",")
.option("quoteMode", "NONE")
.option("escape","\\")
.format("csv")
.save("/tmp/nullcsv/")

var out = dbutils.fs.ls("/tmp/nullcsv/")
var file = out(out.size - 1)
val x = dbutils.fs.head("/tmp/nullcsv/" + file.name)
println(x)
{code}
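(The `dbutils.fs` calls above are Databricks-specific. Outside Databricks, the part files Spark writes can be listed and read back with plain JVM IO; a minimal sketch, where `headPartFile` is a hypothetical stand-in for `dbutils.fs.ls` plus `dbutils.fs.head`, and the scratch directory stands in for /tmp/nullcsv/ -)
{code:java}
import java.nio.file.{Files, Path}
import scala.collection.JavaConverters._

// Hypothetical replacement for the dbutils.fs calls: find the part files in
// the Spark output directory and return the first one's contents.
def headPartFile(dir: Path): Option[String] = {
  val parts = Files.list(dir).iterator().asScala
    .filter(_.getFileName.toString.startsWith("part-"))
    .toList.sortBy(_.getFileName.toString)
  parts.headOption.map(p => new String(Files.readAllBytes(p), "UTF-8"))
}

// Example against a scratch directory (standing in for the real output dir):
val dir = Files.createTempDirectory("nullcsv")
Files.write(dir.resolve("part-00000.csv"), "1,\"\"\n2,hello\n".getBytes("UTF-8"))
headPartFile(dir).foreach(println)
{code}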
Output - 
{code:java}
1,""
3,hi
2,hello
4,
{code}
Expected output - 
{code:java}
1,
3,hi
2,hello
4,
{code}
 

This commit appears to be relevant to my issue: [https://github.com/apache/spark/commit/b7efca7ece484ee85091b1b50bbc84ad779f9bfe]

"Since Spark 2.4, empty strings are saved as quoted empty strings `""`. In version 2.3 and earlier, empty strings are equal to `null` values and do not reflect to any characters in saved CSV files."

I am on Spark version 2.3.2, so empty strings should be written the same as nulls, i.e. with no characters at all. On top of that, I am passing the "emptyValue" option explicitly. However, my empty values are still coming out as `""` in the written file.
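To illustrate the distinction at issue, here is a minimal plain-Scala sketch (not Spark's actual code; `renderField`, `nullToken`, and `emptyToken` are hypothetical names) of how a CSV writer can map a null field and an empty-string field to output tokens. The behavior I expect on 2.3.2 corresponds to both tokens being empty, whereas what I observe is the empty string being rendered as `""` regardless of the `emptyValue` option -
{code:java}
// Minimal sketch of rendering one CSV field, distinguishing null from "".
// Hypothetical names, for illustration only.
def renderField(value: String, nullToken: String, emptyToken: String): String =
  value match {
    case null => nullToken
    case ""   => emptyToken
    case v    => v
  }

// With both tokens empty (the 2.3.x behavior I expect), the rows
// (1,""), (2,"hello"), (4,null) serialize with nothing after the comma:
val rows  = List((1, ""), (2, "hello"), (4, null: String))
val lines = rows.map { case (k, v) => s"$k,${renderField(v, "", "")}" }
lines.foreach(println)
// 1,
// 2,hello
// 4,
{code}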

 
            Reporter: Brian Jones






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org