Posted to user@spark.apache.org by sk skk <sp...@gmail.com> on 2017/12/29 17:19:29 UTC
Custom line/record delimiter
Hi,
Is there an option to write a CSV or text file with a custom record/line
separator through Spark?
I could not find any reference in the API. I have an issue while loading data
into a warehouse: one of the columns in the CSV has a newline character, and
the warehouse does not allow escaping that newline character.
Thank you,
Sk
Re: Custom line/record delimiter
Posted by sk skk <sp...@gmail.com>.
Thanks for the update, Kwon.
Regards,
On Mon, Jan 1, 2018 at 7:54 PM Hyukjin Kwon <gu...@gmail.com> wrote:
> Hi,
>
>
> There's a PR - https://github.com/apache/spark/pull/18581 - and a JIRA
> - SPARK-21289.
>
> Alternatively, you could check out the multiLine option for CSV and see
> whether it is applicable.
>
>
> Thanks.
>
>
> 2017-12-30 2:19 GMT+09:00 sk skk <sp...@gmail.com>:
>
>> Hi,
>>
>> Is there an option to write a CSV or text file with a custom
>> record/line separator through Spark?
>>
>> I could not find any reference in the API. I have an issue while loading
>> data into a warehouse: one of the columns in the CSV has a newline
>> character, and the warehouse does not allow escaping that newline character.
>>
>> Thank you,
>> Sk
>>
>
>
Re: Custom line/record delimiter
Posted by Hyukjin Kwon <gu...@gmail.com>.
Hi,
There's a PR - https://github.com/apache/spark/pull/18581 - and a JIRA
- SPARK-21289.
Alternatively, you could check out the multiLine option for CSV and see
whether it is applicable.
Thanks.
2017-12-30 2:19 GMT+09:00 sk skk <sp...@gmail.com>:
> Hi,
>
> Is there an option to write a CSV or text file with a custom record/line
> separator through Spark?
>
> I could not find any reference in the API. I have an issue while loading
> data into a warehouse: one of the columns in the CSV has a newline
> character, and the warehouse does not allow escaping that newline character.
>
> Thank you,
> Sk
>
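[Editor's note] The underlying problem in this thread - a CSV column whose value
contains a newline, so each logical record spans more than one physical line -
can be illustrated without Spark at all. The sketch below uses plain Python's
csv module (chosen only for illustration; the thread itself concerns Spark's
CSV writer and its multiLine/lineSep options) to show why loaders that split
records on '\n' choke on such data, and one Spark-independent workaround:
replacing the embedded newlines in the offending column before writing.

```python
import csv
import io

# A record whose second column contains an embedded newline.
row = ["id-1", "line one\nline two", "ok"]

# csv.writer quotes the field containing '\n', so the single logical
# record spans two physical lines in the output file. A warehouse
# loader that naively splits records on '\n' will see two broken rows
# here -- the problem described in the original question.
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(repr(buf.getvalue()))

# Workaround: strip or replace embedded newlines in the offending
# column before writing, so every record occupies exactly one line.
cleaned = [c.replace("\n", " ") for c in row]
buf2 = io.StringIO()
csv.writer(buf2).writerow(cleaned)
print(repr(buf2.getvalue()))
```

On the Spark side, the equivalent fix is either to read such files with the
multiLine option mentioned above (so the reader honors quoted newlines), or,
once the linked PR's work is available, to set a custom line separator so the
record delimiter cannot collide with data.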