Posted to issues@spark.apache.org by "Bjørn Jørgensen (Jira)" <ji...@apache.org> on 2022/05/28 12:20:00 UTC

[jira] [Updated] (SPARK-39304) ps.read_csv ignore double quotes.

     [ https://issues.apache.org/jira/browse/SPARK-39304?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bjørn Jørgensen updated SPARK-39304:
------------------------------------
    Summary: ps.read_csv ignore double quotes.  (was: ps.read_csv ignore commas in double quotes.)

> ps.read_csv ignore double quotes.
> ---------------------------------
>
>                 Key: SPARK-39304
>                 URL: https://issues.apache.org/jira/browse/SPARK-39304
>             Project: Spark
>          Issue Type: Bug
>          Components: Pandas API on Spark
>    Affects Versions: 3.4.0
>            Reporter: Bjørn Jørgensen
>            Priority: Major
>         Attachments: Untitled (4).ipynb, csvfile.csv
>
>
> This issue comes from the user@spark.apache.org mailing list thread titled "Complexity with the data" and is also reported on [SO|https://stackoverflow.com/questions/72389385/how-to-load-complex-data-using-pyspark].
> A notebook and the sample data reproducing this error are attached.
> Test data:
> Some years,"If your job title needs additional context, please clarify here:","If ""Other,"" please indicate the currency here: "
> 5-7 years,"I started as the Marketing Coordinator, and was given the ""Associate Product Manager"" title as a promotion. My duties remained mostly the same and include graphic design work, marketing, and product management.",
> 8 - 10 years,equivalent to Assistant Registrar,
> 2 - 4 years,"I manage our fundraising department, primarily overseeing our direct mail, planned giving, and grant writing programs. ",
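For reference, Python's stdlib csv module (and plain pandas.read_csv) parses this sample as intended per RFC 4180: commas inside double-quoted fields are not separators, and doubled quotes decode to a literal quote. A minimal sketch of the expected behavior, using an abbreviated copy of the attached csvfile.csv:

```python
import csv
import io

# Abbreviated sample from the attached csvfile.csv
sample = (
    'Some years,"If your job title needs additional context, please clarify here:",'
    '"If ""Other,"" please indicate the currency here: "\n'
    '5-7 years,"I started as the Marketing Coordinator, and was given the '
    '""Associate Product Manager"" title as a promotion.",\n'
)

rows = list(csv.reader(io.StringIO(sample)))

# Every row should have exactly 3 fields: commas inside quotes are data,
# not delimiters, so the header's embedded commas must not split it.
print([len(r) for r in rows])

# Doubled quotes inside a quoted field decode to a single literal quote.
print(rows[0][2])
```

ps.read_csv should produce the same 3-column result for this file; the bug is that the quoting is not honored.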



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org