Posted to issues@spark.apache.org by "Tim McNamara (Jira)" <ji...@apache.org> on 2021/04/28 01:50:00 UTC

[jira] [Updated] (SPARK-35250) SQL DataFrameReader unescapedQuoteHandling parameter is misdocumented

     [ https://issues.apache.org/jira/browse/SPARK-35250?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Tim McNamara updated SPARK-35250:
---------------------------------
    Summary: SQL DataFrameReader unescapedQuoteHandling parameter is misdocumented  (was: SQL DataFrameReader mode is misdocumented)

> SQL DataFrameReader unescapedQuoteHandling parameter is misdocumented
> ---------------------------------------------------------------------
>
>                 Key: SPARK-35250
>                 URL: https://issues.apache.org/jira/browse/SPARK-35250
>             Project: Spark
>          Issue Type: Documentation
>          Components: docs, Documentation
>    Affects Versions: 3.1.2
>         Environment:  
>            Reporter: Tim McNamara
>            Priority: Major
>              Labels: GoodForNewContributors, easy-fix
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> The unescapedQuoteHandling parameter of DataFrameReader is incorrectly documented: STOP_AT_DELIMITER appears twice in the list of accepted values, and the duplicate appears to have overwritten one of the intended options, e.g. [https://github.com/apache/spark/blob/1d550c4e90275ab418b9161925049239227f3dc9/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala#L744-L749]
> To find every place where this error occurs, this GitHub search is useful: [https://github.com/apache/spark/search?q=STOP_AT_DELIMITER]
> It appears that this bug was introduced here: [https://github.com/apache/spark/pull/30518]
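
For context, a minimal sketch of how the parameter in question is used when reading CSV. This is an illustrative example only: the file path is hypothetical, and the option values listed in the comment are assumed from the underlying univocity parser's UnescapedQuoteHandling enum, not confirmed by this report.

```scala
import org.apache.spark.sql.SparkSession

object UnescapedQuoteHandlingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("unescapedQuoteHandling sketch")
      .master("local[*]")
      .getOrCreate()

    // The option value is forwarded to the CSV parser's unescaped-quote
    // handling strategy. Assumed accepted values (from univocity's enum):
    // STOP_AT_CLOSING_QUOTE, BACK_TO_DELIMITER, STOP_AT_DELIMITER,
    // SKIP_VALUE, RAISE_ERROR.
    val df = spark.read
      .option("header", "true")
      .option("unescapedQuoteHandling", "STOP_AT_CLOSING_QUOTE")
      .csv("/path/to/file.csv") // hypothetical path

    df.show()
    spark.stop()
  }
}
```

The duplicated STOP_AT_DELIMITER entry in the scaladoc means a reader cannot tell which of these behaviors the overwritten option name was meant to describe.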



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org