Posted to issues@spark.apache.org by "Sergey (JIRA)" <ji...@apache.org> on 2016/04/15 19:17:25 UTC

[jira] [Commented] (SPARK-14663) Parse escape sequences in spark-defaults.conf

    [ https://issues.apache.org/jira/browse/SPARK-14663?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15243244#comment-15243244 ] 

Sergey commented on SPARK-14663:
--------------------------------

UPDATE: I have found a different solution for my particular issue (newAPIHadoopFile happens to work with zip files even when passed "org.apache.hadoop.mapreduce.lib.input.TextInputFormat" rather than "com.cotdp.hadoop.ZipFileInputFormat").

However, the issue remains that a value in spark-defaults.conf cannot be set to "\n".
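
For reference, a minimal PySpark sketch of the workaround described above (assuming an existing SparkContext sc; the path "data.zip" is a placeholder):

    # Pass the delimiter straight to the input format via the conf dict
    # instead of relying on spark-defaults.conf. "data.zip" is a
    # placeholder path.
    rdd = sc.newAPIHadoopFile(
        "data.zip",
        "org.apache.hadoop.mapreduce.lib.input.TextInputFormat",
        "org.apache.hadoop.io.LongWritable",
        "org.apache.hadoop.io.Text",
        conf={"textinputformat.record.delimiter": "\n"})

    # newAPIHadoopFile yields (key, value) pairs; keep only the text values.
    lines = rdd.map(lambda kv: kv[1])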

> Parse escape sequences in spark-defaults.conf
> ---------------------------------------------
>
>                 Key: SPARK-14663
>                 URL: https://issues.apache.org/jira/browse/SPARK-14663
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.1
>            Reporter: Sergey
>
> I am trying to specify spark.hadoop.textinputformat.record.delimiter in spark-defaults.conf, namely, to set it to "\n" (character #10, a line feed). I know how to set it via sc.newAPIHadoopFile, but I'd like to set it in the configuration so I can keep using sc.textFile (which also works with zipped files).
> However, I can't find a way to accomplish this.
> I have tried:
> spark.hadoop.textinputformat.record.delimiter \n
> spark.hadoop.textinputformat.record.delimiter '\n'
> spark.hadoop.textinputformat.record.delimiter "\n"
> spark.hadoop.textinputformat.record.delimiter \\n   (two backslashes and the letter n)
> spark.hadoop.textinputformat.record.delimiter
> (a literal newline, i.e. just pressing Enter after the key)
> None of them works. I checked sc._conf.getAll(), and none of the attempts gives the right result.
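
For comparison, setting the same spark.hadoop.* property programmatically sidesteps the conf-file escaping problem entirely, since Python interprets the "\n" before Spark ever sees it. A minimal sketch, assuming PySpark 1.6:

    from pyspark import SparkConf, SparkContext

    # Spark copies spark.hadoop.* properties into the Hadoop Configuration,
    # so this mirrors the spark-defaults.conf line, except that "\n" here
    # is already a real newline when the property is set.
    conf = SparkConf().set("spark.hadoop.textinputformat.record.delimiter",
                           "\n")
    sc = SparkContext(conf=conf)

    # Verifiable the same way the reporter checked: the stored value is a
    # genuine line feed, not the two characters backslash-n.
    print(repr(sc._conf.get("spark.hadoop.textinputformat.record.delimiter")))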


